This page contains the API documentation for the Enot radar software. The HTTP/1.1 and WebSocket protocols are used for interacting with the software. WebSockets are used for streaming continuous time-synchronized data such as trajectories, radar data, and video.
The Content-Length header is required for POST requests. identity and gzip are supported for the Content-Encoding header.
Requests with the optional modified_since parameter (like Get configuration) support long polling: the server replies only if the data has changed since the last request. This reduces the number of requests and the amount of data transmitted over the network. The flow is:
- Send the first request with modified_since=null.
- Save the eTag HTTP response header.
- Pass the eTag from the last response to the modified_since parameter of the next request.
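The generated long-polling subscription classes shown at the end of this page wrap this flow; the sketch below shows the underlying loop with plain Qt networking. The server address and the /configuration path are illustrative, not part of the schema.
#include <QDebug>
#include <QNetworkAccessManager>
#include <QNetworkReply>
#include <QUrlQuery>

void PollConfiguration(QNetworkAccessManager* nam, const QString& last_etag)
{
    QUrl url("http://127.0.0.1:3000/configuration"); // illustrative address/path
    QUrlQuery query;
    // First request uses modified_since=null, later ones the saved eTag.
    query.addQueryItem("modified_since", last_etag.isEmpty() ? "null" : last_etag);
    url.setQuery(query);
    QNetworkReply* reply = nam->get(QNetworkRequest(url));
    QObject::connect(reply, &QNetworkReply::finished, [reply, nam]()
    {
        // The server replied, so the data changed since modified_since.
        const QString etag = QString::fromLatin1(reply->rawHeader("eTag"));
        const QByteArray changed_data = reply->readAll();
        qDebug() << "configuration changed," << changed_data.size() << "bytes";
        reply->deleteLater();
        PollConfiguration(nam, etag); // immediately start the next long poll
    });
}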
Each application implements one or several APIs:
API | Description | Service |
---|---|---|
Client To Server API | Client requests related to sources information, archive functionality, alarm zones and events | radar_server |
Source To Server API | Internal requests used by sources | enot_dsp , lynx_dsp , video_service |
Streaming API | Continuous data streaming using WebSockets | radar_server |
Configuration API | Configuration requests | radar_server , enot_dsp , lynx_dsp |
Enot Source API | Enot radar source requests | enot_dsp |
Video Source API | Video source requests | video_service |
Features:
Most of the SDK sources are generated from this schema with the OpenAPI Generator project, using custom generation templates. Generators for other languages are available.
A more detailed SDK description is available here.
Delete or temporarily disable a source
source_id required | integer <int64> (Id) Example: source_id=2 |
disable required | boolean False - delete the source from the server together with all its archive data. True - temporarily disable the source; it will be enabled again on the next registration request. |
source_id required | integer <int64> (Id) Example: source_id=2 |
time_from required | string <date-time> (Time) Example: time_from=2021-07-24T08:20:49.483Z Response containing telemetry starting from this time. |
time_to required | string <date-time> (Time) Example: time_to=2021-07-24T08:20:49.483Z Response containing telemetry until this time |
columns required | string List of columns to return (comma separated) |
rows_limit required | integer Limit for response rows count |
{- "time": [
- "2021-07-24T08:20:49.483Z"
], - "columns": {
- "property1": {
- "id": "string",
- "locales": {
- "en-US": "String",
- "ru-RU": "Строка"
}, - "data": [
- 0
]
}, - "property2": {
- "id": "string",
- "locales": {
- "en-US": "String",
- "ru-RU": "Строка"
}, - "data": [
- 0
]
}
}
}
modified_since | string <date-time> (Time) Example: modified_since=2021-07-24T08:20:49.483Z If the parameter is used, the server responds only when the data changes (see long polling requests). |
{- "all_sources": [
- 2
], - "sources": [
- {
- "uuid": "a25009df-b197-420d-8824-546c80e006ba",
- "name": "Roof radar",
- "type": "enot_radar",
- "address": "127.0.0.1",
- "id": 2,
- "disabled": true,
- "state": "ok",
- "port": 0
}
]
}
time_from | string <date-time> (Time) Example: time_from=2021-07-24T08:20:49.483Z Response containing archive map starting from this time. |
time_to | string <date-time> (Time) Example: time_to=2021-07-24T08:20:49.483Z Response containing archive map until this time |
{- "server": {
- "streams": [
- {
- "stream_type": "stream_config",
- "chunks": [
- {
- "chunk_id": 1,
- "start": "2021-07-24T08:20:49.483Z",
- "end": "2021-07-24T08:21:49.483Z"
}, - {
- "chunk_id": 2,
- "start": "2021-07-24T08:22:49.483Z",
- "end": "2021-07-24T08:23:49.483Z"
}
]
}
]
}, - "sources": [
- {
- "source_id": 2,
- "streams": [
- {
- "stream_type": "stream_config",
- "chunks": [
- {
- "chunk_id": 1,
- "start": "2021-07-24T08:20:49.483Z",
- "end": "2021-07-24T08:21:49.483Z"
}, - {
- "chunk_id": 2,
- "start": "2021-07-24T08:22:49.483Z",
- "end": "2021-07-24T08:23:49.483Z"
}
]
}
]
}
]
}
modified_since | string <date-time> (Time) Example: modified_since=2021-07-24T08:20:49.483Z If the parameter is used, the server responds only when the data changes (see long polling requests). |
{- "zones": {
- "property1": {
- "id": 2,
- "name": "string",
- "enabled": true,
- "area": {
- "points": [
- {
- "latitude": 55.986697,
- "longitude": 37.214795
}
], - "min_altitude": 0,
- "max_altitude": 0
}, - "camera_aim_rules": {
- "classes_priorities": [
- [
- "string"
]
], - "min_track_length": 0
}, - "jammer_rules": {
- "enabled": true,
- "classes": [
- "string"
], - "min_track_length": 0
}, - "alarm_rules": {
- "classes": [
- "string"
], - "min_track_length": 0,
- "min_video_detections": 0
}
}, - "property2": {
- "id": 2,
- "name": "string",
- "enabled": true,
- "area": {
- "points": [
- {
- "latitude": 55.986697,
- "longitude": 37.214795
}
], - "min_altitude": 0,
- "max_altitude": 0
}, - "camera_aim_rules": {
- "classes_priorities": [
- [
- "string"
]
], - "min_track_length": 0
}, - "jammer_rules": {
- "enabled": true,
- "classes": [
- "string"
], - "min_track_length": 0
}, - "alarm_rules": {
- "classes": [
- "string"
], - "min_track_length": 0,
- "min_video_detections": 0
}
}
}
}
{- "id": 2,
- "name": "string",
- "enabled": true,
- "area": {
- "points": [
- {
- "latitude": 55.986697,
- "longitude": 37.214795
}
], - "min_altitude": 0,
- "max_altitude": 0
}, - "camera_aim_rules": {
- "classes_priorities": [
- [
- "string"
]
], - "min_track_length": 0
}, - "jammer_rules": {
- "enabled": true,
- "classes": [
- "string"
], - "min_track_length": 0
}, - "alarm_rules": {
- "classes": [
- "string"
], - "min_track_length": 0,
- "min_video_detections": 0
}
}
id | integer <int64> (Id) |
name required | string |
enabled required | boolean |
area required | object (GeoArea) |
camera_aim_rules required | object (CameraAimRules) |
jammer_rules required | object (JammerRules) |
alarm_rules required | object (AlarmRules) |
{- "id": 2,
- "name": "string",
- "enabled": true,
- "area": {
- "points": [
- {
- "latitude": 55.986697,
- "longitude": 37.214795
}
], - "min_altitude": 0,
- "max_altitude": 0
}, - "camera_aim_rules": {
- "classes_priorities": [
- [
- "string"
]
], - "min_track_length": 0
}, - "jammer_rules": {
- "enabled": true,
- "classes": [
- "string"
], - "min_track_length": 0
}, - "alarm_rules": {
- "classes": [
- "string"
], - "min_track_length": 0,
- "min_video_detections": 0
}
}
modified_since | string <date-time> (Time) Example: modified_since=2021-07-24T08:20:49.483Z If the parameter is used, the server responds only when the data changes (see long polling requests). |
time_from | string <date-time> (Time) Example: time_from=2021-07-24T08:20:49.483Z Response containing events starting from this time. |
time_to | string <date-time> (Time) Example: time_to=2021-07-24T08:20:49.483Z Response containing events until this time |
{- "events": {
- "property1": {
- "event_id": 2,
- "zone_id": 2,
- "start_time": "2021-07-24T08:20:49.483Z",
- "end_time": "2021-07-24T08:20:49.483Z",
- "decision": {
- "type": "friend",
- "comment": "string"
}, - "track_ids": [
- {
- "source_id": 2,
- "track_id": 2
}
]
}, - "property2": {
- "event_id": 2,
- "zone_id": 2,
- "start_time": "2021-07-24T08:20:49.483Z",
- "end_time": "2021-07-24T08:20:49.483Z",
- "decision": {
- "type": "friend",
- "comment": "string"
}, - "track_ids": [
- {
- "source_id": 2,
- "track_id": 2
}
]
}
}
}
event_id required | integer <int64> (Id) |
type required | string (AlarmEventDecisionType) Enum: "friend" "stranger" "threat" "false_alarm" |
comment required | string |
{- "event_id": 2,
- "type": "friend",
- "comment": "string"
}
modified_since | string <date-time> (Time) Example: modified_since=2021-07-24T08:20:49.483Z |
event_id | integer <int64> (Id) Example: event_id=2 |
time_from | string <date-time> (Time) Example: time_from=2021-07-24T08:20:49.483Z Response containing events starting from this time. |
time_to | string <date-time> (Time) Example: time_to=2021-07-24T08:20:49.483Z Response containing events until this time |
{- "event_tracks": {
- "property1": {
- "event_id": 2,
- "track_length": 0,
- "start_time": "2021-07-24T08:20:49.483Z",
- "end_time": "2021-07-24T08:20:49.483Z",
- "track_id": {
- "source_id": 2,
- "track_id": 2
}, - "video_detections_ids": [
- 2
]
}, - "property2": {
- "event_id": 2,
- "track_length": 0,
- "start_time": "2021-07-24T08:20:49.483Z",
- "end_time": "2021-07-24T08:20:49.483Z",
- "track_id": {
- "source_id": 2,
- "track_id": 2
}, - "video_detections_ids": [
- 2
]
}
}
}
A source must be registered with a generated UUID before sending other requests.
uuid required | string (UUID) Unique 128-bit resource identifier |
name required | string (SourceName) |
type required | string (SourceType) Enum: "enot_radar" "camera" "lynx_bearing" "dji" "skyhunter" "kuntsevo_radar" |
address required | string (IPV4Address) |
{- "uuid": "a25009df-b197-420d-8824-546c80e006ba",
- "name": "Roof radar",
- "type": "enot_radar",
- "address": "127.0.0.1"
}
2
The Configuration API provides a flexible way of reading telemetry and reading/setting configuration. The format contains all the information (item type, text translations, enum values, attributes) needed to generate a configuration UI. The list of supported configuration groups and properties depends on the service type:
modified_since | string <date-time> (Time) Example: modified_since=2021-07-24T08:20:49.483Z If the parameter is used, the server responds only when the data changes (see long polling request schema). |
{- "all_groups": [
- "string"
], - "groups": [
- {
- "id": "group_id",
- "name": {
- "en-US": "String",
- "ru-RU": "Строка"
}, - "all_items": [
- "string"
], - "items": [
- {
- "id": "string",
- "name": {
- "en-US": "String",
- "ru-RU": "Строка"
}, - "enum_items": [
- {
- "id": "string",
- "name": {
- "en-US": "String",
- "ru-RU": "Строка"
}
}
], - "value": {
- "type": "bool",
- "data": "string"
}, - "attributes": {
- "readonly": true,
- "status": "ok",
- "applying": true,
- "min": 0,
- "max": 0,
- "step": 0,
- "spin_step": 0,
- "decimals": 0,
- "echo_mode": true,
- "regexp": "string",
- "exclude_from_config": true,
- "exclude_from_export": true,
- "internal": true,
- "hidden": true,
- "log_to_db": true
}
}
]
}
]
}
Validates and sets the configuration. The operation is cancelled if validation of any property fails.
Configuration values.
groups required | object Key contains the group id |
{- "groups": {
- "property1": {
- "values": {
- "property1": {
- "type": "bool",
- "data": "string"
}, - "property2": {
- "type": "bool",
- "data": "string"
}
}
}, - "property2": {
- "values": {
- "property1": {
- "type": "bool",
- "data": "string"
}, - "property2": {
- "type": "bool",
- "data": "string"
}
}
}
}
}
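As a minimal sketch, this body can be built with Qt JSON; the group id and property id used here are illustrative.
#include <QJsonDocument>
#include <QJsonObject>

QByteArray BuildSetConfigurationBody()
{
    QJsonObject value{{"type", "bool"}, {"data", "true"}};
    QJsonObject values{{"enabled", value}};  // key: property id
    QJsonObject group{{"values", values}};
    QJsonObject groups{{"group_id", group}}; // key: group id
    return QJsonDocument(QJsonObject{{"groups", groups}}).toJson();
}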
radar_server: "enabled_categories", "enabled", "data_directory"
The configuration is used when generating trajectory merger frames. The value of tracks_merger is a JSON string in MergerConfiguration format:
merged_sources required | Array of Arrays of integers <int64> (Id) unique Subsets of sources whose tracks can be merged into one group. |
bearing_intersections_merger_distance required | number <float> Merge distance for two bearing intersection areas. |
point_merger_distance required | number <float> Merge distance for tracks with known coordinates. |
{- "merged_sources": [
- [
- 2
]
], - "bearing_intersections_merger_distance": 0.1,
- "point_merger_distance": 0.1
}
enot_dsp: "enabled_categories", "source_name", "write_dump", "latitude", "adapter_id", "ip", "range_spreading_losts", "count", "detection_point_power_min", "enabled", "history_size", "lifetime_azimuth_1", "enabled", "enabled", "surface_config"
The value of surface_config is a JSON string in TerrainSurfaceConfig format:
url required | string Schematic surface tiles server with |
level required | integer Zoom level. |
water_hue_min required | integer Min water color hue. |
water_hue_max required | integer Max water color hue. |
{- "level": 0,
- "water_hue_min": 0,
- "water_hue_max": 0
}
The value of altitude_config is a JSON string in TerrainAltitudeConfig format:
url required | string Altitude tiles server with |
level required | integer Zoom level. |
{- "level": 0
}
"enabled"
"min_track_length"
"distance_between_antennas"
The value of zones_filter is a JSON string in FilterAreas format:
areas required | Array of objects (FilterArea) |
{- "areas": [
- {
- "name": "string",
- "enabled": true,
- "area": {
- "points": [
- {
- "latitude": 55.986697,
- "longitude": 37.214795
}
], - "min_altitude": 0,
- "max_altitude": 0
}
}
]
}
"processing_overflow"
"frames_per_second"
"ignore_errors"
"send_ip"
"enabled"
"transceiver"
"serial_number"
"disable_sector_1"
"heterodin_start"
"transceiver_state"
"startup_state"
"min_mdm_power"
"state"
lynx_dsp: "enabled_categories", "source_name", "write_dump", "latitude", "adapter_id", "ip", "ffadc", "max_object_azimuth_width", "track_lifetime", "enabled", "history", "processing_overflow", "max_interval_loss"
source_id required | integer <int64> (Id) |
track_id required | integer <int64> Trajectory id (unique for each source) |
{- "source_id": 2,
- "track_id": 2
}
{- "coords": {
- "coords": {
- "pan": -1,
- "tilt": -1
}, - "zoom": 1
}, - "angles": {
- "pan": -180,
- "tilt": -90
}, - "geo_angles": {
- "pan": -180,
- "tilt": -90
}, - "first_fov": {
- "h": 0,
- "v": 0
}, - "second_fov": {
- "h": 0,
- "v": 0
}, - "focus": 1,
- "is_moving": true,
- "is_locked": true,
- "jammer_state": true,
- "autojammer_state": true,
- "jamming_time": "2019-08-24T14:15:22Z",
- "glass_wipper_state": true
}
pan required | number Pan angular speed (degrees/sec). |
tilt required | number Tilt angular speed (degrees/sec). |
zoom required | number Zoom angular speed (degrees/sec). |
{- "pan": 0,
- "tilt": 0,
- "zoom": 0
}
coords | object (PTCoords) Normalized PT coordinates. |
zoom | number [ 0 .. 1 ] |
speeds required | object (PTZSpeeds) Normalized PTZ speeds. |
{- "coords": {
- "pan": -1,
- "tilt": -1
}, - "zoom": 1,
- "speeds": {
- "pan": -1,
- "tilt": -1,
- "zoom": -1
}
}
target_position required | object (GeoPosition) |
target_diameter required | number Diameter of the target sphere (meters). Zoom is chosen so that the target occupies the entire horizontal field of view. |
fov_scale | number Optional scale for the FOV. |
speeds required | object (PTZAngularSpeeds) |
{- "target_position": {
- "latitude": 55.986697,
- "longitude": 37.214795,
- "altitude": 23.52
}, - "target_diameter": 0,
- "fov_scale": 0,
- "speeds": {
- "pan": 0,
- "tilt": 0,
- "zoom": 0
}
}
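The relation between target diameter, range, and the chosen field of view implied above can be written out explicitly; the sketch below is illustrative, not part of the API.
#include <cmath>

// Horizontal FOV (degrees) at which a sphere of target_diameter_m at
// range_m fills the frame; fov_scale is the optional multiplier above.
double RequiredHorizontalFovDeg(double target_diameter_m, double range_m,
                                double fov_scale = 1.0)
{
    const double fov_rad = 2.0 * std::atan2(target_diameter_m / 2.0, range_m);
    return fov_scale * fov_rad * 180.0 / M_PI;
}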
Move the camera so that its center looks at a certain point of the video frame.
position required | object (Point) |
second_video | boolean True to move relative to the second video source |
{- "position": {
- "x": 0,
- "y": 0
}, - "second_video": true
}
WebSockets are used for transmitting continuous data streams. Each connection can transmit several stream types of one source. The client can request realtime or archive data. All streams inside one connection are synchronized by time.
Each WebSocket message received from the server or sent by the client contains:
- byte 0: Stream type.
- byte 1: Frame type.
- byte 2: Serialization format.
- byte 3: If 1, the frame body is compressed with GZIP.
- bytes 4-11: Frame creation datetime as the number of milliseconds that have passed since 1970-01-01T00:00:00 UTC (signed 64-bit integer).
A header-parsing sketch follows the formats table below.
Supported frame serialization formats:
Format | Value | Encoding details
---|---|---
json | 0 | Text serialization format.
cbor | 1 | Binary serialization format based on JSON. [Overview] [Implementations].
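As a minimal sketch of reading the 12-byte header described above (the struct and field names are illustrative, and the byte order of the timestamp is an assumption):
#include <QByteArray>
#include <QtEndian>

struct WsHeader
{
    quint8 stream_type;          // byte 0
    quint8 frame_type;           // byte 1
    quint8 serialization_format; // byte 2: 0 = json, 1 = cbor
    bool gzip;                   // byte 3: 1 if the body is GZIP-compressed
    qint64 time_ms;              // bytes 4-11: ms since 1970-01-01T00:00:00 UTC
};

bool ParseHeader(const QByteArray& message, WsHeader& out)
{
    if (message.size() < 12)
        return false; // too short to contain a header
    const auto* bytes = reinterpret_cast<const quint8*>(message.constData());
    out.stream_type = bytes[0];
    out.frame_type = bytes[1];
    out.serialization_format = bytes[2];
    out.gzip = bytes[3] == 1;
    out.time_ms = qFromLittleEndian<qint64>(bytes + 4); // byte order: assumption
    return true;
}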
It is highly recommended not to use the JSON format for sent and received frames. JSON remains available for debugging / fast integration purposes.
Message header byte indicating the frame type (format or data) and the data type the message body contains.
Frame Type | Description
---|---
0 | Format frame (optional for some streams). The client must save the format and use it to decode/visualize data frames. Sent by the server when:
>= 1 | Data frames. Possible frame types depend on the stream type.
CBOR / header value | JSON value | Source type | Header frame type and related message body types
---|---|---|---
0 | stream_config | Any |
1 | trajectories | enot_radar, camera, lynx_bearing, dji, skyhunter |
2 | video | camera |
3 | video_metadata | camera |
4 | enot_detection_points | enot_radar |
5 | enot_intensity_map | enot_radar |
6 | enot_clutter_map | enot_radar |
7 | enot_spectrum_map | enot_radar |
8 | enot_spectrum_data | enot_radar |
9 | lynx_spectrum | lynx_bearing |
10 | second_video | camera | For the second camera head (infrared).
11 | second_video_metadata | camera | For the second camera head (infrared).
12 | enot_noise_map | enot_radar |
13 | video_analyzer | camera | Video analyzer results.
14 | second_video_analyzer | camera | Second camera head (infrared) analyzer results.
read_config | object (ReadConfig) Used by the client to initialize a WS connection for stream reading. |
read_streams | Array of objects (ReadStreamConfig) unique Configuration of reading streams |
seek_to | string <date-time> Playback seek time. Pass |
speed | number >= 0 Playback speed. Passing |
{- "read_config": {
- "source_id": 2,
- "serialization_format": "json",
- "gzip_compression": true
}, - "read_streams": [
- {
- "type": "stream_config",
- "export_depth": 0
}
], - "seek_to": "2021-07-24T08:20:49.483Z",
- "speed": 0
}
end_of_export | string Enum: "stream_config" "trajectories" "video" "video_metadata" "enot_detection_points" "enot_intensity_map" "enot_clutter_map" "enot_spectrum_map" "enot_spectrum_data" "lynx_spectrum" "second_video" "second_video_metadata" "enot_noise_map" "video_analyzer" "second_video_analyzer" Indicates that the export of the specified stream has completed. See |
end_of_chunk | string Enum: "stream_config" "trajectories" "video" "video_metadata" "enot_detection_points" "enot_intensity_map" "enot_clutter_map" "enot_spectrum_map" "enot_spectrum_data" "lynx_spectrum" "second_video" "second_video_metadata" "enot_noise_map" "video_analyzer" "second_video_analyzer" Indicates that playback of the specified stream has reached the end of archive chunk. |
{- "end_of_export": "stream_config",
- "end_of_chunk": "stream_config"
}
azimuth required | number <float> [ 0 .. 360 ] Current antenna azimuth (degrees). |
points required | Array of objects (EnotDetectionPoint) >= 0 items |
{- "azimuth": 360,
- "points": [
- {
- "range": 0.1,
- "azimuth": 360,
- "elevation": -90,
- "normalized_power": 1,
- "normalized_speed": 1,
- "mask": "string"
}
]
}
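Each detection point is given in polar radar coordinates (range/azimuth/elevation). A sketch of converting one point to a local Cartesian frame follows; the angle conventions (azimuth clockwise from north, elevation above the horizon) are assumptions, not stated by the schema.
#include <cmath>

struct LocalXYZ { double east, north, up; };

LocalXYZ ToLocal(double range_m, double azimuth_deg, double elevation_deg)
{
    const double az = azimuth_deg * M_PI / 180.0; // assumed clockwise from north
    const double el = elevation_deg * M_PI / 180.0;
    const double horizontal = range_m * std::cos(el);
    return {horizontal * std::sin(az),  // east
            horizontal * std::cos(az),  // north
            range_m * std::sin(el)};    // up
}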
range_step required | number <float> Size of one ray element (meters) |
start_range required | number <float> First ray element range (meters) |
range_size required | integer <int32> Number of ray elements |
azimuth_size required | integer <int32> Number of azimutal elements |
{- "range_step": 0.1,
- "start_range": 0.1,
- "range_size": 0,
- "azimuth_size": 0
}
azimuth required | integer <int32> Current ray azimuth |
ray_data required | string <binary> Buffer containing the ray elements. Each unsigned byte encodes coloring mode and intensity: values 0..192 are colored intensity and the rest (192..255) are grayscale intensity. |
{- "azimuth": 0,
- "ray_data": "string"
}
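A sketch of decoding one ray_data byte according to the rule above; the description overlaps at value 192, which is treated as grayscale here (an assumption), and the names are illustrative.
#include <cstdint>

struct RayCell
{
    bool grayscale;  // coloring mode
    float intensity; // normalized to 0..1
};

RayCell DecodeRayByte(uint8_t b)
{
    if (b < 192)
        return {false, b / 191.0f};   // colored intensity, 0..191
    return {true, (b - 192) / 63.0f}; // grayscale intensity, 192..255
}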
azimuth_size required | integer <int32> Number of azimutal elements |
range_size required | integer <int32> Number of ray elements |
range_step required | number <float> Size of one ray element (meters) |
start_range required | number <float> First ray element range (meters) |
speed_size required | integer <int32> Number of speed elements |
speed_step required | number <float> Size of one speed element (msec) |
threshold required | integer <int32> Number of detections needed to filter out a cell. |
{- "azimuth_size": 0,
- "range_size": 0,
- "range_step": 0.1,
- "start_range": 0.1,
- "speed_size": 0,
- "speed_step": 0.1,
- "threshold": 0
}
azimuth required | integer <int32> Current ray azimuth |
mask required | string <binary> Buffer size is |
detections required | string <binary> Buffer size is |
{- "azimuth": 0,
- "mask": "string",
- "detections": "string"
}
"empty"
start_range required | number <float> Start range (meters) |
end_range required | number <float> End range (meters) |
azimuth_size required | integer <int32> Number of azimutal elements |
{- "start_range": 0.1,
- "end_range": 0.1,
- "azimuth_size": 0
}
azimuths required | string <binary> Buffer size is |
{- "azimuths": "string"
}
format required | object (EnotSpectrumFormat) |
min_incremental_azimuth required | integer <int64> |
max_incremental_azimuth required | integer <int64> |
power required | Array of numbers <float> [ items <float > ] Power (dB). |
cfar_threshold | Array of numbers or null <float> [ items <float > ] CFAR threshold (dB). |
compensation_power | Array of numbers <float> [ items <float > ] Compensation channel power (dB). |
altitude | Array of numbers <float> [ items <float > ] Altitude (meters). |
mask required | string <binary> Buffer size is |
{- "format": {
- "azimuth_size": 0,
- "range_size": 0,
- "range_step": 0.1,
- "start_range": 0.1,
- "speed_size": 0,
- "speed_step": 0.1,
- "azimuth": 0
}, - "min_incremental_azimuth": 0,
- "max_incremental_azimuth": 0,
- "power": [
- 0.1
], - "cfar_threshold": [
- 0.1
], - "compensation_power": [
- 0.1
], - "altitude": [
- 0.1
], - "mask": "string"
}
8-bit mask containing detection and filtering algorithm flags.
bit | description
---|---
0 | Analog-to-digital converter overflow. False or missing CFAR detections are possible.
1 | Whether the point passed the CFAR detection filter. The meaning of the remaining bits depends on this value.
If the CFAR detection filter was not passed (bit 1 is 0):
bit | description
---|---
2 | Point removed by the central band filter.
3 | Point removed by the central band power filter.
If the CFAR detection filter was passed (bit 1 is 1):
bit | description
---|---
2 | Point filtered by the compensation channel filter.
3 | Point filtered by the clutter filter.
4 | Point filtered by the clutter map filter.
5 | Point filtered by the zones filter.
6 | Point became part of the target.
7 | Point became the most intense part of the target.
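A sketch of the mask bits above expressed as flags; the enum and the helper are illustrative, not part of the SDK.
#include <cstdint>

enum EnotSpectrumMaskBits : uint8_t
{
    kAdcOverflow = 1 << 0, // false or missing CFAR detections possible
    kCfarPassed  = 1 << 1, // meaning of bits 2..7 depends on this bit
    // The flags below apply when kCfarPassed is set:
    kCompensationFiltered = 1 << 2,
    kClutterFiltered      = 1 << 3,
    kClutterMapFiltered   = 1 << 4,
    kZonesFiltered        = 1 << 5,
    kPartOfTarget         = 1 << 6,
    kMostIntenseOfTarget  = 1 << 7,
};

// A point that passed CFAR detection and became part of a target.
bool IsTargetPoint(uint8_t mask)
{
    return (mask & kCfarPassed) && (mask & kPartOfTarget);
}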
frequency_start required | integer Beginning of the device frequency band (MHz) |
frequency_end required | integer End of the device frequency band (MHz) |
power required | Array of numbers <float> [ items <float > ] Power for each frequency (dB). |
threshold required | Array of numbers <float> [ items <float > ] Threshold for each frequency (dB). |
mask required | string <binary> One byte with LynxSpectrumMask for each frequency. |
{- "frequency_start": 0,
- "frequency_end": 0,
- "power": [
- 0.1
], - "threshold": [
- 0.1
], - "mask": "string"
}
8-bit mask containing detection and filtering algorithm flags.
bit | description
---|---
0 | Analog-to-digital converter overflow. False or missing CFAR detections are possible.
1 | Whether the point passed the CFAR detection filter.
2 | Point removed by the clutter filter.
3 | Point removed by the interval filter.
4 | Point became part of the target.
5 | Point became the most intense part of the target.
The client should store trajectory data and append/update/remove it when new stream frames are received; a sketch follows the example below.
time required | string <date-time> Changes time. |
new_points required | Array of objects (TrajectoryPoint) New/appended trajectory points. |
removed_tracks required | Array of integers <int64> (Id) unique [ items <int64 > ] Array of removed trajectories. There will be no more updates of these trajectories. |
remove_all | boolean If |
{- "time": "2021-07-24T08:20:49.483Z",
- "new_points": [
- {
- "id": {
- "track_id": 2,
- "point_id": 2
}, - "detection_time": {
- "realtime": "2021-07-24T08:20:49.483Z",
- "archive": "2021-07-24T08:20:49.483Z"
}, - "enot_data": {
- "detection": {
- "speed_width": 0.1,
- "azimuth_width": 0.1,
- "cfar_points_count": 0.1,
- "central_band_power": 0.1,
- "main_interval_sum_power": 0.1,
- "power_max": 0.1,
- "power_sum": 0.1,
- "all_intervals_sum_of_max_power": 0.1,
- "power_offset": 0.1,
- "near_range_power_correction": 0.1,
- "cfar_noise": 0.1,
- "min_cfar_coefficient": 0.1,
- "median_noise": 0.1,
- "ratio_to_compensation": 0.1
}, - "position": {
- "position": {
- "latitude": 55.986697,
- "longitude": 37.214795,
- "altitude": 23.52
}, - "range": 0.1,
- "radar_azimuth": 0.1,
- "gps_azimuth": 0.1,
- "elevation": 0.1,
- "elevation_rmse": 0.1,
- "altitude": 0.1,
- "kalman_offset": 0.1,
- "altitude_rmse": 0.1,
- "source_terrain_altitude": 0.1,
- "terrain_altitude": 0.1,
- "altitude_above_sea": 0.1,
- "altitude_above_terrain": 0.1,
- "radial_speed": 0.1,
- "track_radial_speed": 0.1,
- "track_tangential_speed": 0.1,
- "track_speed": 0.1,
- "course": 0.1
}, - "tracking": {
- "antenna_turnover": 0,
- "rcs": 0.1,
- "debug_rcs": {
- "main_iterval_sum": 0.1,
- "main_iterval_max": 0.1,
- "all_intervals_sum": 0.1,
- "all_intervals_sum_of_max": 0.1
}, - "track_avg_rcs": 0.1,
- "debug_track_rcs": {
- "avg": 0.1,
- "median": 0.1,
- "median_avg": 0.1
}, - "class_name": "string",
- "penalty": [
- {
- "point": {
- "track_id": 2,
- "point_id": 2
}, - "total": 0.1,
- "position": 0.1,
- "position_weakening": 0.1,
- "doppler_speed": 0.1,
- "track_radial_speed": 0.1,
- "track_tangential_speed": 0.1,
- "rcs": 0.1
}
]
}, - "spectrum": {
- "azimutal_segments": [
- {
- "format": {
- "azimuth_size": 0,
- "range_size": 0,
- "range_step": 0.1,
- "start_range": 0.1,
- "speed_size": 0,
- "speed_step": 0.1,
- "azimuth": 0
}, - "min_incremental_azimuth": 0,
- "max_incremental_azimuth": 0,
- "power": [
- 0.1
], - "cfar_threshold": [
- 0.1
], - "compensation_power": [
- 0.1
], - "altitude": [
- 0.1
], - "mask": "string"
}
], - "max_power_segment_index": 0
}, - "unreliable": true
}, - "bearing_target": {
- "geometry": {
- "source_position": {
- "latitude": 55.986697,
- "longitude": 37.214795
}, - "range": 0.1,
- "azimuth": {
- "value": 0.1,
- "standard_deviation": 0.1
}
}, - "lynx_target": {
- "frequency": 0.1,
- "bandwidth": 0,
- "power": 0.1,
- "snr": 0.1,
- "doa": [
- 0.1
], - "spectrum": {
- "frequency_start": 0,
- "frequency_end": 0,
- "power": [
- 0.1
], - "threshold": [
- 0.1
], - "mask": "string"
}
}, - "skyhunter_target": {
- "target_type": "controller",
- "comm_system": "string",
- "device_id": "string",
- "detection_id": 0,
- "source": "string",
- "frequency": 0.1,
- "bandwidth": 0.1,
- "snr": 0.1,
- "is_enemy": true
}
}, - "coordinate_target": {
- "position": {
- "latitude": 55.986697,
- "longitude": 37.214795,
- "altitude": 23.52
}, - "dji_server_target": {
- "info": "string"
}, - "skyhunter_target": {
- "info": {
- "target_type": "controller",
- "comm_system": "string",
- "device_id": "string",
- "detection_id": 0,
- "source": "string",
- "frequency": 0.1,
- "bandwidth": 0.1,
- "snr": 0.1,
- "is_enemy": true
}, - "drone_id": "string"
}
}
}
], - "removed_tracks": [
- 2
], - "remove_all": true
}
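A minimal sketch of applying a changes frame to a client-side track store, using the SDK model types shown later on this page; the TrackStore alias is illustrative, and the optionality of the fields follows the generated model (an assumption).
#include <QtGlobal>
#include <map>
#include <vector>
// plus the generated oapi model headers

using TrackStore = std::map<qint64 /*track_id*/, std::vector<oapi::TrajectoryPoint>>;

void ApplyChanges(TrackStore& tracks, const oapi::TrajectoryChangesFrame& frame)
{
    if (frame.remove_all && *frame.remove_all)
        tracks.clear(); // drop everything before applying the new state

    if (frame.new_points)
        for (const auto& p : *frame.new_points)
            tracks[p.id.track_id].push_back(p); // append/update trajectory

    if (frame.removed_tracks)
        for (qint64 id : *frame.removed_tracks)
            tracks.erase(id); // no more updates will arrive for these tracks
}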
Interpolated positions related to a specific trajectory point. Interpolated positions are appended or replaced when new points are added to the trajectory.
interpolation required | Array of objects non-empty |
{- "interpolation": [
- {
- "id": {
- "track_id": 2,
- "point_id": 2
}, - "rewrite": true,
- "positions": [
- {
- "latitude": 55.986697,
- "longitude": 37.214795,
- "altitude": 23.52
}
]
}
]
}
Additional information related to a specific trajectory point. Generated by the server; it can arrive with a delay after the trajectory point.
point_id required | object (TrajectoryPointId) |
alarm_events | Array of integers <int64> (Id) unique [ items <int64 > ] |
video_detections | Array of integers <int64> (Id) unique [ items <int64 > ] |
{- "point_id": {
- "track_id": 2,
- "point_id": 2
}, - "alarm_events": [
- 2
], - "video_detections": [
- 2
]
}
Tracks from different sources caused by one real target are placed into one group according to the merger configuration. The client can display only the main track of each group to avoid trajectory duplication.
The merger frame also includes geometry information about Lynx bearing intersections.
Merger frames are generated by the server. To receive them, the client must subscribe to the radar_server trajectories stream by passing an empty source_id in ReadConfig when initializing the reading stream; see the sketch after the example below.
new_groups required | Array of objects (TrajectoryMergerGroup) non-empty unique New/updated merger groups. |
removed_groups required | Array of integers <int64> (Id) unique [ items <int64 > ] Array of removed merger groups. There will be no more updates of these groups. |
remove_all required | boolean If |
bearing_metadata required | Array of objects (BearingMergerMetadata) |
{- "new_groups": [
- {
- "id": 2,
- "unreliable": true,
- "tracks": [
- {
- "source_id": 2,
- "track_id": 2
}
], - "main_track": {
- "source_id": 2,
- "track_id": 2
}, - "bearing_intersections": [
- [
- {
- "source_id": 2,
- "track_id": 2
}
]
]
}
], - "removed_groups": [
- 2
], - "remove_all": true,
- "bearing_metadata": [
- {
- "intersection_pairs": [
- [
- {
- "source_id": 2,
- "track_id": 2
}
]
], - "merged_position": {
- "area": [
- {
- "latitude": 55.986697,
- "longitude": 37.214795
}
], - "position": {
- "latitude": 55.986697,
- "longitude": 37.214795
}, - "radius": 0.1
}, - "avg_frequency_todo": "string"
}
]
}
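A sketch of such a subscription using the streaming_api::Stream class described in the SDK section below; owner and server_streams_address are illustrative placeholders.
auto stream = new streaming_api::Stream(owner);
stream->open(server_streams_address);
// An empty source_id selects the server's merged trajectories stream.
stream->initRead({std::nullopt /*source_id*/, std::nullopt, std::nullopt});
stream->readStreams({{oapi::StreamType::E_TRAJECTORIES, std::nullopt}});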
The video stream can be decoded using the ffmpeg library.
codec_id required | integer <int32> |
width required | integer <int32> |
height required | integer <int32> |
data required | string <binary> Contains AVCodecParameters::extradata |
{- "codec_id": 0,
- "width": 0,
- "height": 0,
- "data": "string"
}
Corresponds to AVCodecParameters in the ffmpeg library.
Example of codec initialization from a VideoFormat structure:
std::optional<Codec> AvCodecFromVideoFormat(const oapi::VideoFormat& video_format)
{
    AVCodecParameters* av_codec_params_ptr = avcodec_parameters_alloc();
    auto& av_codec_params = *av_codec_params_ptr;
    av_codec_params.codec_type = AVMediaType::AVMEDIA_TYPE_VIDEO;
    av_codec_params.codec_id = static_cast<AVCodecID>(video_format.codec_id);
    av_codec_params.width = video_format.width;
    av_codec_params.height = video_format.height;
    av_codec_params.format = AVPixelFormat::AV_PIX_FMT_YUV420P;
    // Copy extradata into a zeroed buffer with the padding ffmpeg requires
    av_codec_params.extradata = reinterpret_cast<uint8_t*>(
        av_mallocz(video_format.data.size() + AV_INPUT_BUFFER_PADDING_SIZE));
    av_codec_params.extradata_size = video_format.data.size();
    memcpy(av_codec_params.extradata, video_format.data.data(), video_format.data.size());
    Codec codec;
    codec.av_codec = avcodec_find_decoder(av_codec_params.codec_id);
    if (!codec.av_codec)
    {
        avcodec_parameters_free(&av_codec_params_ptr);
        return std::nullopt;
    }
    codec.av_codec_ctx = avcodec_alloc_context3(codec.av_codec);
    if (!codec.av_codec_ctx)
    {
        avcodec_parameters_free(&av_codec_params_ptr);
        return std::nullopt;
    }
    auto res = avcodec_parameters_to_context(codec.av_codec_ctx, av_codec_params_ptr);
    if (res < 0)
    {
        avcodec_parameters_free(&av_codec_params_ptr);
        avcodec_free_context(&codec.av_codec_ctx);
        return std::nullopt;
    }
    // Open codec
    res = avcodec_open2(codec.av_codec_ctx, codec.av_codec, nullptr);
    if (res < 0)
    {
        avcodec_parameters_free(&av_codec_params_ptr);
        avcodec_free_context(&codec.av_codec_ctx);
        return std::nullopt;
    }
    avcodec_parameters_free(&av_codec_params_ptr);
    return codec;
}
key_frame required | boolean |
time required | string <date-time> (Time) |
data required | string <binary> |
{- "key_frame": true,
- "time": "2021-07-24T08:20:49.483Z",
- "data": "string"
}
Corresponds to AVPacket in the ffmpeg library.
Example of packet initialization from a VideoPacket structure:
AVPacket AvPacketFromFrame(oapi::VideoPacket& packet)
{
    AVPacket av_packet;
    av_init_packet(&av_packet);
    // The AVPacket references the buffer without copying, so `packet`
    // must outlive the returned AVPacket.
    av_packet.data = reinterpret_cast<uint8_t*>(packet.data.data());
    av_packet.size = packet.data.size();
    av_packet.flags = packet.key_frame ? AV_PKT_FLAG_KEY : 0;
    return av_packet;
}
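To complete the picture, a sketch of feeding such a packet into the decoder initialized above, using the standard ffmpeg send/receive loop (Codec is the structure from the previous example):
bool DecodePacket(Codec& codec, AVPacket& av_packet)
{
    if (avcodec_send_packet(codec.av_codec_ctx, &av_packet) < 0)
        return false; // decoder rejected the packet

    AVFrame* frame = av_frame_alloc();
    while (avcodec_receive_frame(codec.av_codec_ctx, frame) == 0)
    {
        // frame->data / frame->linesize now hold a decoded YUV420P picture
    }
    av_frame_free(&frame);
    return true;
}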
video_frame_time_utc_ms required | integer <int64> Time of related video frame |
ptz_state | object (PTZState) |
aimed_trajectory | object (SourceTrajectoryPointId) Trajectory point at which the camera was aimed |
{- "video_frame_time_utc_ms": 0,
- "ptz_state": {
- "coords": {
- "coords": {
- "pan": -1,
- "tilt": -1
}, - "zoom": 1
}, - "angles": {
- "pan": -180,
- "tilt": -90
}, - "geo_angles": {
- "pan": -180,
- "tilt": -90
}, - "first_fov": {
- "h": 0,
- "v": 0
}, - "second_fov": {
- "h": 0,
- "v": 0
}, - "focus": 1,
- "is_moving": true,
- "is_locked": true,
- "jammer_state": true,
- "autojammer_state": true,
- "jamming_time": "2019-08-24T14:15:22Z",
- "glass_wipper_state": true
}, - "aimed_trajectory": {
- "source_id": 2,
- "track_id": 2,
- "point_id": 2
}
}
state | string (VideoAnalyzerState) Enum: "error" "disabled" "enabled" Used only in VideoService analyzer |
date required | string <date-time> Time of analyzed video frame |
objs | Array of objects (VideoAnalyzerObject) |
category_id | object Classification index for each model |
category | object Classification label for each model |
score | object Score prediction for each model |
{- "state": "error",
- "date": "2019-08-24T14:15:22Z",
- "objs": [
- {
- "id": "497f6eca-6276-4993-bfeb-53cbbbba6f08",
- "bbox": [
- 0
], - "keypoints": [
- [
- 0
]
], - "segmentation": [
- [
- 0
]
], - "category_id": 0,
- "category": "string",
- "track_id": 0,
- "label": "string",
- "score": 0.1,
- "objs": [
- "497f6eca-6276-4993-bfeb-53cbbbba6f08"
]
}
], - "category_id": {
- "property1": 0,
- "property2": 0
}, - "category": {
- "property1": "string",
- "property2": "string"
}, - "score": {
- "property1": 0.1,
- "property2": 0.1
}
}
├───libs
│   ├───utils - helper classes and functions
│   ├───http_types - base HTTP types
│   ├───http_client - HTTP client library
│   ├───oapi_models - generated OpenAPI structures, serialization / deserialization
│   ├───oapi_client - generated APIs for client implementation
│   └───streaming_api - WebSocket stream, message header, frame creation and parsing
└───examples
    ├───1_single_request
    ├───2_polling_subscription
    ├───3_long_polling_subscription
    ├───4_source_configuration
    ├───5_reading_radar_streams
    ├───6_archive_export
    └───7_serialization_format_comparison
The code examples below are based on the SDK examples.
Each model is represented with an oapi::<name> structure. OpenAPI constructs map to C++ types: optional fields to std::optional<T>, arrays to std::vector, unique arrays to std::set<T>, dictionaries to std::map<QString, T>, and enums to enum class.
oapi::ReadStreamRequests request;
request.seek_to = QDateTime::currentDateTimeUtc().addSecs(-30);
request.speed = 0.5;
// default serialization
QByteArray cbor_gzip_buffer =
streaming_api::Serialize(request,
Header(oapi::StreamType::E_STREAM_CONFIG,
utils::to_underlying(oapi::CommonFrameType::E_DATA)));
// manually set time, format & compression
QByteArray json_buffer =
streaming_api::Serialize(request,
Header(oapi::StreamType::E_STREAM_CONFIG,
utils::to_underlying(oapi::CommonFrameType::E_DATA),
QDateTime::currentDateTimeUtc().toMSecsSinceEpoch(),
false, // disable gzip compression
oapi::SerializationFormat::E_JSON));
QByteArray buffer = ...;
oapi::TrajectoryChangesFrame traj;
std::optional<Error> error = streaming_api::Deserialize(traj, buffer);
if (error)
{
qDebug() << *error;
return;
}
qDebug() << traj.time.toString(Qt::ISODateWithMs);
The streaming_api::Stream class provides a WebSocket stream with easy configuration and reconnect handling.
...
auto stream = new streaming_api::Stream(owner);
// Open connection
stream->open(streams_address);
// Set handler for incoming stream frames & playback state
QObject::connect(stream, &streaming_api::Stream::onMessage, owner, &ProcessMessage);
//Configure as reading stream with default format and compression
stream->initRead({source_id, std::nullopt, std::nullopt});
//Load 60 seconds of trajectories archive to restore scene before starting a playback
const int kExportDepthSecs = 60;
//Set requested streams & their configurations
stream->readStreams({{oapi::StreamType::E_TRAJECTORIES, kExportDepthSecs},
{oapi::StreamType::E_ENOT_INTENSITY_MAP, std::nullopt}});
//Configure playback
stream->playback()->setSpeed(0.5);
stream->playback()->seek(QDateTime::currentDateTimeUtc().addSecs(-60));
//for switch back to realtime: stream->playback()->seekToRealtime();
...
void ProcessMessage(const QByteArray& message)
{
//Parse frame depending on StreamType and FrameType
const streaming_api::Header* header = streaming_api::GetHeader(message);
switch (header->stream_type)
{
case oapi::StreamType::E_TRAJECTORIES:
{
if (header->frame_type == utils::to_underlying(oapi::TrajectoryFrameType::E_CHANGES))
{
oapi::TrajectoryChangesFrame frame;
auto error = streaming_api::Deserialize(frame, message);
if (error)
{
qDebug() << "TrajectoryChangesFrame deserialization error:" << *error;
return;
}
if (frame.new_points)
{
for (auto& p : *frame.new_points)
{
qDebug() << "Trajectory" << p.id.track_id << "updated, position:" << toJson(p.position);
}
}
}
//handle other frame types
...
}
//handle other stream types
...
Each API is represented with an oapi::<name>API class:
#include <oapi_client/ClientToServerApi.h>
...
auto server_api = QSharedPointer<oapi::ClientToServerApi>::create();
server_api->setHttpConnectionSettings(utils::NetworkAddress
{
QHostAddress("127.0.0.1"),
3000
});
server_api->getTargetClasses(this, [] (RequestResult<oapi::TargetClasses> result,
http_client::NetworkReplyData /*network_reply_data*/)
{
if (result.error)
{
qDebug() << result.error->description;
return;
}
const oapi::TargetClasses& classes = *result.result;
for (const auto& c: classes.classes)
{
qDebug() << "Class: id" << c.first;
for (const auto& locale : c.second.locales)
{
qDebug() << "language id: " << locale.first
<< "text: " << locale.second;
}
qDebug() << "";
}
});
Subscription classes <name>Subscription are generated for all API requests:
Polling (periodic) subscription example:
auto subscription =
QSharedPointer<oapi::GetSourcesInfoSubscription>(new oapi::GetSourcesInfoSubscription());
QObject::connect(subscription.get(),
&oapi::GetSourcesInfoSubscription::onResponse,
this,
[s = subscription.get()]()
{
if (s->error())
{
qDebug() << s->error()->description;
return;
}
const oapi::SourcesInfo& sources = *s->response();
qDebug() << oapi::toJsonString(sources);
});
const int kRequestTimeoutMs = 1000;
const int kRequestPeriodMs = 1000;
subscription->setRequest(
[api = server_api.get(), kRequestTimeoutMs](QObject* owner, auto on_response)
{ api->getSourcesInfo(std::nullopt, owner, on_response, kRequestTimeoutMs); },
kRequestPeriodMs);
Long polling subscription example (the server replies only when data changes):
auto subscription =
QSharedPointer<oapi::GetSourcesInfoSubscription>(new oapi::GetSourcesInfoSubscription());
QObject::connect(subscription.get(),
&oapi::GetSourcesInfoSubscription::onResponse,
[s = subscription.get()]()
{
if (s->error())
{
qDebug() << s->error()->description;
return;
}
const oapi::SourcesInfo& sources = *s->response();
qDebug() << oapi::toJsonString(sources);
});
subscription->setRequest(
[api = server_api.get(), s = subscription.get()](QObject* owner, auto on_response) {
api->getSourcesInfo(
s->lastModifiedTime(), owner, on_response, http_client::kLongPollTimeout);
});