Enot radar software API (2.5.0)

This page contains the API documentation for Enot radar software. HTTP 1.1 and WebSocket protocols are used to interact with the software. WebSockets are used for streaming continuous time-synchronized data such as trajectories, radar data and video.

HTTP Protocol specifics

  • The Content-Length header is required for POST requests.
  • identity and gzip are supported for the Content-Encoding header (a minimal request sketch follows below).
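
For illustration, a minimal POST sketch using Qt's QNetworkAccessManager (not part of the SDK; the URL stands for any POST endpoint on this page). QNetworkAccessManager derives Content-Length from the body size automatically:

#include <QNetworkAccessManager>
#include <QNetworkReply>
#include <QNetworkRequest>

void PostJson(QNetworkAccessManager& manager, const QUrl& url, const QByteArray& json_body)
{
    QNetworkRequest request(url);
    request.setHeader(QNetworkRequest::ContentTypeHeader, "application/json");
    // Body is sent with identity encoding; Content-Length is set from json_body.size().
    QNetworkReply* reply = manager.post(request, json_body);
    QObject::connect(reply, &QNetworkReply::finished, reply, &QNetworkReply::deleteLater);
}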

Long polling requests

Requests with the optional modified_since parameter (like Get configuration) support long polling: the server replies only when the data has changed since the last request. This reduces the number of requests and the amount of data transmitted over the network.

  1. On the first request the client requests an instant response by passing modified_since=null.
  2. The server returns the full response and the last modification time in the eTag HTTP response header.
  3. The next request passes the eTag from the last response in the modified_since parameter.
  4. The server keeps the connection open without responding until the response data changes. After receiving a response the client repeats step 3 (see the sketch below).
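
A minimal long-polling loop sketch in plain Qt (the SDK's long polling subscription class shown at the end of this page wraps the same logic). The query parameter and eTag header follow the description above; the URL is whichever long-polling request is being called:

#include <QNetworkAccessManager>
#include <QNetworkReply>
#include <QUrlQuery>

void LongPoll(QNetworkAccessManager& manager, QUrl url, const QString& modified_since)
{
    QUrlQuery query;
    query.addQueryItem("modified_since", modified_since.isEmpty() ? "null" : modified_since);
    url.setQuery(query);  // replaces any previous query

    QNetworkReply* reply = manager.get(QNetworkRequest(url));
    QObject::connect(reply, &QNetworkReply::finished, reply, [&manager, url, reply]()
    {
        // Step 2: full response body plus last modification time in the eTag header.
        const QByteArray body = reply->readAll();
        const QString etag = QString::fromUtf8(reply->rawHeader("eTag"));
        reply->deleteLater();
        LongPoll(manager, url, etag);  // steps 3-4: repeat with the received eTag
    });
}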

APIs

Each application implements one or several APIs:

  • Client To Server API: client requests related to sources information, archive functionality, alarm zones and events. Service: radar_server
  • Source To Server API: internal requests used by sources. Services: enot_dsp, lynx_dsp, video_service
  • Streaming API: continuous data streaming using WebSockets. Service: radar_server
  • Configuration API: configuration requests. Services: radar_server, enot_dsp, lynx_dsp
  • Enot Source API: Enot radar source requests. Service: enot_dsp
  • Video Source API: video source requests. Service: video_service

Qt C++ SDK

Features:

  • OpenAPI models as C++ structures.
  • API requests as C++ methods.
  • Serialization / deserialization with JSON and CBOR formats, GZIP compression support.
  • Streaming API implementation.
  • Examples.

Most of the SDK sources are generated from this schema with the OpenAPI Generator project using custom generation templates. Generators for other languages are available.

A more detailed SDK description is available in the SDK overview section below.

Client To Server

Requests to radar_server used by client applications

Get server UTC time.

Responses

Response samples

Content type
application/json
"2021-07-24T08:20:49.483Z"

Get possible target classes.

Responses

Response samples

Content type
application/json
{
  • "classes": {
    }
}

Delete or temporarily disable source

query Parameters
source_id
required
integer <int64> (Id)
Example: source_id=2
disable
required
boolean

False: delete the source from the server with all its archive data. True: temporarily disable the source; it will be re-enabled on the next registration request.

Responses

Get source telemetry for a period of time.

query Parameters
source_id
required
integer <int64> (Id)
Example: source_id=2
time_from
required
string <date-time> (Time)
Example: time_from=2021-07-24T08:20:49.483Z

Response containing telemetry starting from this time.

time_to
required
string <date-time> (Time)
Example: time_to=2021-07-24T08:20:49.483Z

Response containing telemetry until this time.

columns
required
string

List of columns to return (comma separated)

rows_limit
required
integer

Limit on the number of rows in the response.

Responses

Response samples

Content type
application/json
{
  • "time": [
    ],
  • "columns": {
    }
}

Get information about all sources.

query Parameters
modified_since
string <date-time> (Time)
Example: modified_since=2021-07-24T08:20:49.483Z

If the parameter is used, the server responds only when the data changes (see long polling requests).

Responses

Response samples

Content type
application/json
{
  • "all_sources": [
    ],
  • "sources": [
    ]
}

Clear archive.

Responses

Get archive range.

Responses

Response samples

Content type
application/json
{
  • "start": "2021-07-24T08:20:49.483Z",
  • "end": "2021-07-24T08:20:49.483Z"
}

Get archive map.

query Parameters
time_from
string <date-time> (Time)
Example: time_from=2021-07-24T08:20:49.483Z

Response containing archive map starting from this time.

time_to
string <date-time> (Time)
Example: time_to=2021-07-24T08:20:49.483Z

Response containing archive map until this time.

Responses

Response samples

Content type
application/json
{
  • "server": {
    },
  • "sources": [
    ]
}

Get video frame.

query Parameters
source_id
required
integer <int64> (Id)
Example: source_id=2
time
required
string <date-time> (Time)
Example: time=2021-07-24T08:20:49.483Z

Responses

Get part of video frame with detection rect.

query Parameters
detection_id
required
integer <int64> (Id)
Example: detection_id=2
margin
required
integer

Margin in pixels from detection rect for output image.

Responses

Get all alarm zones.

query Parameters
modified_since
string <date-time> (Time)
Example: modified_since=2021-07-24T08:20:49.483Z

If the parameter is used, the server responds only when the data changes (see long polling requests).

Responses

Response samples

Content type
application/json
{
  • "zones": {
    }
}

Get alarm zone.

query Parameters
zone_id
required
integer <int64> (Id)
Example: zone_id=2

Responses

Response samples

Content type
application/json
{
  • "id": 2,
  • "name": "string",
  • "enabled": true,
  • "area": {
    },
  • "camera_aim_rules": {
    },
  • "jammer_rules": {
    },
  • "alarm_rules": {
    }
}

Create or update alarm zone.

Request Body schema: application/json
required
id
integer <int64> (Id)
name
required
string
enabled
required
boolean
required
object (GeoArea)
required
object (CameraAimRules)
required
object (JammerRules)
required
object (AlarmRules)

Responses

Request samples

Content type
application/json
{
  • "id": 2,
  • "name": "string",
  • "enabled": true,
  • "area": {
    },
  • "camera_aim_rules": {
    },
  • "jammer_rules": {
    },
  • "alarm_rules": {
    }
}

Delete alarm zone.

query Parameters
zone_id
required
integer <int64> (Id)
Example: zone_id=2

Responses

Get all alarm events.

query Parameters
modified_since
string <date-time> (Time)
Example: modified_since=2021-07-24T08:20:49.483Z

If the parameter is used, the server responds only when the data changes (see long polling requests).

time_from
string <date-time> (Time)
Example: time_from=2021-07-24T08:20:49.483Z

Response containing events starting from this time.

time_to
string <date-time> (Time)
Example: time_to=2021-07-24T08:20:49.483Z

Response containing events until this time.

Responses

Response samples

Content type
application/json
{
  • "events": {
    }
}

Delete alarm event.

query Parameters
event_id
required
integer <int64> (Id)
Example: event_id=2

Responses

Set decision for alarm event.

Request Body schema: application/json
required
event_id
required
integer <int64> (Id)
type
required
string (AlarmEventDecisionType)
Enum: "friend" "stranger" "threat" "false_alarm"
comment
required
string

Responses

Request samples

Content type
application/json
{
  • "event_id": 2,
  • "type": "friend",
  • "comment": "string"
}

Close alarm event.

query Parameters
event_id
required
integer <int64> (Id)
Example: event_id=2

Responses

Get alarm event tracks.

query Parameters
modified_since
string <date-time> (Time)
Example: modified_since=2021-07-24T08:20:49.483Z
event_id
integer <int64> (Id)
Example: event_id=2
time_from
string <date-time> (Time)
Example: time_from=2021-07-24T08:20:49.483Z

Response containing events starting from this time.

time_to
string <date-time> (Time)
Example: time_to=2021-07-24T08:20:49.483Z

Response containing events until this time.

Responses

Response samples

Content type
application/json
{
  • "event_tracks": {
    }
}

Source To Server

Requests to radar_server used by sources

Get last track id for source

query Parameters
source_id
required
integer <int64> (Id)
Example: source_id=2

Responses

Response samples

Content type
application/json
2

Get server UTC time.

Responses

Response samples

Content type
application/json
"2021-07-24T08:20:49.483Z"

Append possible target classes.

Request Body schema: application/json
required
required
object

Responses

Request samples

Content type
application/json
{
  • "classes": {
    }
}

Register source on the server

A source must be registered with a generated UUID before sending other requests.

Request Body schema: application/json
required
uuid
required
string (UUID)

Unique 128-bit resource identifier

name
required
string (SourceName)
type
required
string (SourceType)
Enum: "enot_radar" "camera" "lynx_bearing" "dji" "skyhunter" "kuntsevo_radar"
address
required
string (IPV4Address)

Responses

Request samples

Content type
application/json
{
  • "uuid": "a25009df-b197-420d-8824-546c80e006ba",
  • "name": "Roof radar",
  • "type": "enot_radar",
  • "address": "127.0.0.1"
}

Response samples

Content type
application/json
2

Configuration

The Configuration API provides a flexible way of reading telemetry and reading/setting configuration. The format contains all information (item type, text translations, enum values, attributes) needed to generate a configuration UI. The list of supported configuration groups and properties depends on the service type; see the Radar server, Enot source, and Lynx source sections below.

Get configuration.

query Parameters
modified_since
string <date-time> (Time)
Example: modified_since=2021-07-24T08:20:49.483Z

If the parameter is used, the server responds only when the data changes (see long polling requests).

Responses

Response samples

Content type
application/json
{
  • "all_groups": [
    ],
  • "groups": [
    ]
}

Set configuration.

Validates and sets the configuration. The operation is cancelled if validation of any property fails.

Request Body schema: application/json
required

Configuration values.

required
object

Key contains group id

Responses

Request samples

Content type
application/json
{
  • "groups": {
    }
}

Radar server

Configuration groups for radar_server:

string (LogParameters)
Enum: "enabled_categories" "infos_enabled" "warnings_enabled" "errors_enabled"
"enabled_categories"
string (ServerArchiveParameters)
Enum: "enabled" "storage_depth_hours" "min_free_space_gb"
"enabled"
string (ServerCommonParameters)
Enum: "data_directory" "tracks_merger"
"data_directory"

Merger configuration:

This configuration is used when generating trajectory merger frames. The value of tracks_merger is a JSON string in MergerConfiguration format:

merged_sources
required
Array of integers (Id) unique [ items <int64 > unique [ items <int64 > ] ]

Subsets of sources whose tracks can be merged into one group.

bearing_intersections_merger_distance
required
number <float>

Merge distance for two bearing intersection areas.

point_merger_distance
required
number <float>

Merge distance for tracks with known coordinates.

{
  • "merged_sources": [
    ],
  • "bearing_intersections_merger_distance": 0.1,
  • "point_merger_distance": 0.1
}

Enot source

Configuration groups for enot_dsp.

string (LogParameters)
Enum: "enabled_categories" "infos_enabled" "warnings_enabled" "errors_enabled"
"enabled_categories"
string (SourceGeneralParameters)
Enum: "source_name" "server_address" "server_port" "host_address"
"source_name"
string (SourceDumpParameters)
Enum: "write_dump" "directory" "cache_size_mb" "write_speed" "cache" "write_time" "file_size"
"write_dump"
string (SourceLocationParameters)
Enum: "latitude" "longitude" "altitude" "azimuth"
"latitude"
string (AdapterRestartParameters)
Enum: "adapter_id" "loss_history_length" "coherent_loss_threshold" "restart_period" "current_coherent_loss"
"adapter_id"
string (NetworkConnectionParameters)
Enum: "ip" "port" "retranslation" "retranslation_ip" "retranslation_port" "protocol" "input_stream_mbit"
"ip"
string (EnotSegmentationParameters)
Enum: "range_spreading_losts" "max_power_difference" "union_range" "union_azimuth" "union_frequency" "min_points_count_1" "min_points_count_2" "min_points_count_3" "min_points_count_4" "filter_after_segmentation_1" "filter_after_segmentation_2" "filter_after_segmentation_3" "filter_after_segmentation_4" "use_noise_calibration" "noise_calibration" "rcs_power_calculation_type"
"range_spreading_losts"
string (EnotRangeProfilesParameters)
Enum: "count" "start_2" "start_3" "start_4"
"count"
string (EnotVisualizationParameters)
Enum: "detection_point_power_min" "detection_point_power_max" "map_power_min" "map_power_max" "map_noise_min" "map_noise_max" "min_power_for_elevation" "spectrum_stream_enabled" "spectrum_stream_azimuth" "spectrum_stream_cache_size" "additional_target_spectrum_data" "target_spectrums_range_margin" "target_spectrums_for_all_azimuth"
"detection_point_power_min"
string (EnotClutterFilterParameters)
Enum: "enabled" "union_range" "union_frequency" "min_filtered_range"
"enabled"
string (EnotClutterMapParameters)
Enum: "history_size" "filter_percent" "union_range" "union_azimuth" "union_frequency" "extension_range" "extension_azimuth" "extension_frequency"
"history_size"
string (EnotTrackingParameters)
Enum: "lifetime_azimuth_1" "lifetime_azimuth_2" "lifetime_azimuth_3" "lifetime_azimuth_4" "delay_azimuth" "extrapolate" "position_penalty_coeff" "position_weakening_track_length" "position_weakening_1" "position_weakening_2" "position_weakening_3" "position_weakening_4" "doppler_speed_penalty_coeff" "track_radial_speed_penalty_coeff" "track_tangential_speed_penalty_coeff" "rcs_penalty_coeff" "max_penalty_1" "max_penalty_2" "max_penalty_3" "max_penalty_4" "kalman_coeff" "tracking_info_size" "track_rcs_history_length" "track_rcs_calculation_type"
"lifetime_azimuth_1"
string (EnotElevationParameters)
Enum: "enabled" "phase_shift" "phase_autofocus" "antenna_tilt"
"enabled"
string (EnotCompensationParameters)
Enum: "enabled" "threshold"
"enabled"
string (TerrainProviderParameters)
Enum: "surface_config" "surface_loaded" "altitude_config" "altitude_loaded"
"surface_config"

The value of surface_config is a JSON string in TerrainSurfaceConfig format:

url
required
string

Schematic surface tile server URL with [x], [y] and [z] substrings.

level
required
integer

Zoom level.

water_hue_min
required
integer

Min water color hue.

water_hue_max
required
integer

Max water color hue.

{}

The value of altitude_config is a JSON string in TerrainAltitudeConfig format:

url
required
string

Altitude tile server URL with [x], [y] and [z] substrings. Altitude must be in Mapbox format.

level
required
integer

Zoom level.

string (EnotClassificationParameters)
Enum: "enabled" "use_terrain_altitude" "max_ground_altitude" "enable_unknown_class" "enable_boat_class" "enable_plane_class" "max_noise_rcs" "max_birds_rcs" "max_drone_rcs" "max_human_rcs" "max_human_speed"
"enabled"
string (EnotReliableTracksParameters)
Enum: "min_track_length" "all_classes"
"min_track_length"
string (EnotDspParameters)
Enum: "distance_between_antennas" "vertical_antenna_pattern" "horizontal_antenna_pattern" "mean_compensation" "coherent_size" "central_band" "overlap" "training_freq" "guard_freq" "training_range" "guard_range" "cfar_coefficient_1" "cfar_coefficient_2" "cfar_coefficient_3" "cfar_coefficient_4" "processing_start_range" "processing_end_range" "processing_start_azimuth" "processing_end_azimuth" "adc_limit_using" "coherent_loss_filter" "zones_filter"
"distance_between_antennas"

The value of zones_filter is a JSON string in FilterAreas format:

required
Array of objects (FilterArea)
{
  • "areas": [
    ]
}
string (EnotTelemetryParameters)
Enum: "processing_overflow" "time_since_processing_overflow" "matched_queue_size" "coherent_loss" "angle_sensor" "azimuth" "rotation_speed_from_data" "max_overflow_range" "radar_time_str" "radar_time_sec" "radar_date_time" "time_delta_sec" "time_delta_str" "time_since_last_sync_str" "time_since_last_sync_sec" "dsp_time_str" "dsp_time_sec" "cfar_points_total" "cfar_points_after_filter" "cfar_filter_percent" "clutter_cells_percent" "clutter_speeds_percent" "active_tracks" "elevation_error"
"processing_overflow"
string (EnotSystemSpecifications)
Enum: "frames_per_second" "pulses_per_second" "data_rate" "pulse_time" "pulse_period" "pulses_per_interval" "duty_cycle" "interval_energy_coeff" "sampling_freq" "processing_bandwidth" "pulse_spectrum_width" "pulse_sampling_start_freq" "pulse_sampling_end_freq" "pulse_sampling_max_freq" "pulse_int_avg_freq" "pulse_avg_freq" "rotation_speed" "min_range" "receiver_min_range" "max_range" "range_resolution" "weighted_range_resolution" "range_discrete" "unambiguity_range" "velocity_resolution" "velocity_min" "velocity_max" "optimal_guard_range" "optimal_guard_freq" "optimal_min_points_count"
"frames_per_second"
string (BfosParserParameters)
Enum: "ignore_errors" "fdds" "fadc" "ffpga" "ffrrw" "heterodin" "thinning" "thinning_2" "signal_window" "model_window" "coherent_window" "max_parsed_channels" "max_frame_loss" "fps" "frame_loss"
"ignore_errors"
string (BfosConnectionParameters)
Enum: "send_ip" "send_port" "recv_ip" "recv_port" "timeout" "connected"
"send_ip"
string (BfosRetranslationParameters)
Enum: "enabled" "send_ip" "send_port" "recv_ip" "recv_port"
"enabled"
string (BfosControlParameters)
Enum: "transceiver" "rotator_speed" "lmk_reset" "adc_overflow_format"
"transceiver"
string (BfosTelemetryParameters)
Enum: "serial_number" "firmware_version" "motor_state" "azimuth" "time_str" "time_sec"
"serial_number"
string (BfosDisabledSectorsParameters)
Enum: "disable_sector_1" "start_sector_1" "end_sector_1" "disable_sector_2" "start_sector_2" "end_sector_2" "disable_sector_3" "start_sector_3" "end_sector_3" "disable_sector_4" "start_sector_4" "end_sector_4"
"disable_sector_1"
string (TransceiverParameters)
Enum: "heterodin_start" "heterodin" "if_tx" "rf_tx" "if_rx_1" "rf_rx_1" "if_rx_2" "rf_rx_2" "if_rx_3" "rf_rx_3" "if_rx_4" "rf_rx_4" "tx_mdm_strobe" "rx_mdm_strobe" "tx_pa_strobe" "power_sensor_strobe_delay"
"heterodin_start"
string (TransceiverTelemetryParameters)
Enum: "transceiver_state" "duty_cycle_error" "dld_state" "pll_state" "mdm_current" "mdm_temperature" "mdm_power" "pa_current" "pa_temperature" "pa_power"
"transceiver_state"
string (EnotErrorRulesParameters)
Enum: "startup_state" "angle_sensor" "rotation_speed" "max_loss" "max_coherent_loss" "terrain_tiles"
"startup_state"
string (BfosErrorRulesParameters)
Enum: "min_mdm_power" "min_pa_power" "errors" "error_on_suboptimal_transceiver_parameters"
"min_mdm_power"
string (StartupManagerParameters)
Enum: "state" "noise_deviation" "signal_deviation" "doppler_power" "start" "autostart" "startup_preset" "required_rotator_speed" "diagnostics" "inversed_rotator_start" "max_noise_deviation" "max_signal_deviation" "max_doppler_power"
"state"

Lynx source

Configuration groups for lynx_dsp.

string (LogParameters)
Enum: "enabled_categories" "infos_enabled" "warnings_enabled" "errors_enabled"
"enabled_categories"
string (SourceGeneralParameters)
Enum: "source_name" "server_address" "server_port" "host_address"
"source_name"
string (SourceDumpParameters)
Enum: "write_dump" "directory" "cache_size_mb" "write_speed" "cache" "write_time" "file_size"
"write_dump"
string (SourceLocationParameters)
Enum: "latitude" "longitude" "altitude" "azimuth"
"latitude"
string (AdapterRestartParameters)
Enum: "adapter_id" "loss_history_length" "coherent_loss_threshold" "restart_period" "current_coherent_loss"
"adapter_id"
string (NetworkConnectionParameters)
Enum: "ip" "port" "retranslation" "retranslation_ip" "retranslation_port" "protocol" "input_stream_mbit"
"ip"
string (LynxDspParameters)
Enum: "ffadc" "ffpga" "frequency_start" "interval_size" "overlap" "training_freq" "guard_freq" "error_probability" "error_probability_accumulated" "coherent_loss_filter" "phase_shift" "channels_mode"
"ffadc"
string (LynxSegmentationParameters)
Enum: "max_object_azimuth_width" "min_detection_counts" "union_azimuth" "union_frequency" "max_azimuth_rms" "only_accumulated_detections" "antennas_distance"
"max_object_azimuth_width"
string (LynxTrackingParameters)
Enum: "track_lifetime" "union_azimuth" "union_frequency"
"track_lifetime"
string (LynxFakeTargetsGenerationParameters)
Enum: "enabled" "azimuth" "azimuth_offset" "sector" "sector_offset"
"enabled"
string (LynxClutterParameters)
Enum: "history" "filter_percent" "union_azimuth" "union_frequency" "extension_azimuth" "extension_frequency"
"history"
string (LynxTelemetryParameters)
Enum: "processing_overflow" "time_since_processing_overflow" "detection_queue_size" "interval_loss" "device_time_str" "device_time_sec" "device_date_time" "time_delta_sec" "time_delta_str" "time_since_last_sync_str" "time_since_last_sync_sec" "azimuth_error"
"processing_overflow"
string (LynxErrorRulesParameters)
Value: "max_interval_loss"
"max_interval_loss"

Enot Source

Requests for enot_dsp

Write parameters to radar onboard EPROM

Responses

Video Source

Requests for video_service

Force PTZ to follow the trajectory. Stops following if the argument is not set.

Request Body schema: application/json
optional
source_id
required
integer <int64> (Id)
track_id
required
integer <int64>

Trajectory id (unique for each source)

Responses

Request samples

Content type
application/json
{
  • "source_id": 2,
  • "track_id": 2
}

Get state

Responses

Response samples

Content type
application/json
{
  • "coords": {
    },
  • "angles": {
    },
  • "geo_angles": {
    },
  • "first_fov": {
    },
  • "second_fov": {
    },
  • "focus": 1,
  • "is_moving": true,
  • "is_locked": true,
  • "jammer_state": true,
  • "autojammer_state": true,
  • "jamming_time": "2019-08-24T14:15:22Z",
  • "glass_wipper_state": true
}

Get limits

Responses

Response samples

Content type
application/json
{
  • "tilt": {
    },
  • "h_fov": {
    },
  • "max_speed": {
    }
}

Unlock PTZ after a user action so that aiming algorithms can take control.

Responses

Continuous move with normalized speeds.

Request Body schema: application/json
required
pan
required
number [ -1 .. 1 ]
tilt
required
number [ -1 .. 1 ]
zoom
required
number [ -1 .. 1 ]

Responses

Request samples

Content type
application/json
{
  • "pan": -1,
  • "tilt": -1,
  • "zoom": -1
}

Continuous move with angular speeds.

Request Body schema: application/json
required
pan
required
number

Pan angular speed (degrees/sec).

tilt
required
number

Tilt angular speed (degrees/sec).

zoom
required
number

Zoom angular speed (degrees/sec).

Responses

Request samples

Content type
application/json
{
  • "pan": 0,
  • "tilt": 0,
  • "zoom": 0
}

Absolute move with normalized coordinates and speeds.

Request Body schema: application/json
required
object (PTCoords)

Normalized PT coordinates.

zoom
number [ 0 .. 1 ]
required
object (PTZSpeeds)

Normalized PTZ speeds.

Responses

Request samples

Content type
application/json
{
  • "coords": {
    },
  • "zoom": 1,
  • "speeds": {
    }
}

Absolute move to angles with angular speed.

Request Body schema: application/json
required
object (PTAngles)
fov
number
required
object (PTZAngularSpeeds)

Responses

Request samples

Content type
application/json
{
  • "angles": {
    },
  • "fov": 0,
  • "speeds": {
    }
}

Stop PTZ move.

Responses

Move to geo coordinates

Request Body schema: application/json
required
required
object (GeoPosition)
target_diameter
required
number

Diameter of target sphere (meters). Zoom is chosen so that the target occupies the entire horizontal field of view.

fov_scale
number

Optional scale for FOV.

required
object (PTZAngularSpeeds)

Responses

Request samples

Content type
application/json
{
  • "target_position": {
    },
  • "target_diameter": 0,
  • "fov_scale": 0,
  • "speeds": {
    }
}

Move to home position.

Responses

Move to position on video frame

Move the camera so that its center looks at a certain point of the video frame.

Request Body schema: application/json
required
required
object (Point)
second_video
boolean

True to move relative to the second video source.

Responses

Request samples

Content type
application/json
{
  • "position": {
    },
  • "second_video": true
}

Continuous focus move.

Request Body schema: application/json
required
speed
required
number [ -1 .. 1 ]

Responses

Request samples

Content type
application/json
{
  • "speed": -1
}

Absolute focus move.

Request Body schema: application/json
required
focus
required
number [ 0 .. 1 ]
speed
required
number [ -1 .. 1 ]

Responses

Request samples

Content type
application/json
{
  • "focus": 1,
  • "speed": -1
}

Stop focus move.

Responses

Do auto focus once.

Responses

Enable/disable jammer.

Request Body schema: application/json
required
enable
required
boolean

Responses

Request samples

Content type
application/json
{
  • "enable": true
}

Enable/disable autojammer.

Request Body schema: application/json
required
enable
required
boolean

Responses

Request samples

Content type
application/json
{
  • "enable": true
}

Enable/disable glass wiper.

Request Body schema: application/json
required
enable
required
boolean

Responses

Request samples

Content type
application/json
{
  • "enable": true
}

Move camera to next target immediately.

Responses

Move camera to previous target immediately, if it's alive.

Responses

Streaming API

WebSockets are used for transmitting continuous data streams. Each connection can carry several stream types of one source. The client can request realtime or archive data. All streams inside one connection are synchronized by time.

Each WebSocket message received from the server or sent by the client contains:

  • Header: the first 12 bytes.
  • Serialized frame body: the remaining bytes.

Header structure:

  • byte 0: Stream type.
  • byte 1: Frame type.
  • byte 2: Serialization format.
  • byte 3: If 1, frame body is compressed with GZIP.
  • bytes 4-11: Frame creation datetime as the number of milliseconds that have passed since 1970-01-01T00:00:00, UTC (signed 64-bit integer).
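
A sketch of parsing this header in C++ (the struct and helper are illustrative, not the SDK's streaming_api::Header; the timestamp byte order is assumed little-endian here):

#include <QByteArray>
#include <QtEndian>

struct MessageHeader
{
    quint8 stream_type;           // byte 0
    quint8 frame_type;            // byte 1
    quint8 serialization_format;  // byte 2: 0 = json, 1 = cbor
    bool gzip_compressed;         // byte 3
    qint64 creation_time_utc_ms;  // bytes 4-11
};

bool ParseHeader(const QByteArray& message, MessageHeader& out)
{
    if (message.size() < 12)
        return false;
    const uchar* bytes = reinterpret_cast<const uchar*>(message.constData());
    out.stream_type = bytes[0];
    out.frame_type = bytes[1];
    out.serialization_format = bytes[2];
    out.gzip_compressed = bytes[3] == 1;
    out.creation_time_utc_ms = qFromLittleEndian<qint64>(bytes + 4);
    return true;
}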

Serialization format:

Supported frame serialization formats:

Format Value Encoding details
json 0 Text serialization format.
  • Enum fields encoded as strings.
  • Binary data encoded using Base64.
cbor 1 Binary serialization format based on JSON.
  • Enum fields encoded as signed 64-bit integers.
  • Date and date-time can be NULL (0xf6).

It is highly recommended not to use the JSON format for sent and received frames, because it:

  • Forces the server to transcode frames.
  • Has a lower serialization / deserialization speed.
  • Greatly increases the amount of data transmitted over the network, especially for binary data.

Nevertheless, the JSON format is available for debugging / fast integration purposes.

Frame type:

A message header byte indicating the frame type (format or data) and the data type contained in the message body.

Frame Type Description
0 Format frame (optional for some streams). The client must save the format and use it to decode/visualize data frames. Sent by the server when:
  • The client initializes a subscription to the stream.
  • The format was changed.
>= 1 Data frames. Possible frame types depend on the stream type.

Stream type:

Each stream type is listed below with its CBOR / header value, JSON value, source type, and the header frame types with their related message body types.

  • 0 stream_config: any source type.
    Frame type 1: ReadStreamRequests / ReadStreamPlaybackState
  • 1 trajectories: enot_radar, camera, lynx_bearing, dji, skyhunter.
    Frame type 1: Trajectory changes frame; 2: Trajectory interpolation frame; 3: Trajectory metadata frame; 4: Trajectory merger frame
  • 2 video: camera.
    Frame type 0: VideoFormat; 1: VideoPacket
  • 3 video_metadata: camera.
    Frame type 1: VideoMetadataFrame
  • 4 enot_detection_points: enot_radar.
    Frame type 1: EnotDetectionPointsFrame
  • 5 enot_intensity_map: enot_radar.
    Frame type 0: EnotIntensityMapFormat; 1: EnotIntensityMapFrame
  • 6 enot_clutter_map: enot_radar.
    Frame type 0: EnotClutterMapFormat; 1: EnotClutterMapFrame
  • 7 enot_spectrum_map: enot_radar.
    Frame type 0: EnotSpectrumMapFormat; 1: EnotSpectrumMapFrame
  • 8 enot_spectrum_data: enot_radar.
    Frame type 1: EnotSpectrumFrame
  • 9 lynx_spectrum: lynx_bearing.
    Frame type 1: LynxSpectrumFrame
  • 10 second_video: camera; for the second camera head (infrared).
    Frame type 0: VideoFormat; 1: VideoPacket
  • 11 second_video_metadata: camera; for the second camera head (infrared).
    Frame type 1: VideoMetadataFrame
  • 12 enot_noise_map: enot_radar.
    Frame type 0: EnotIntensityMapFormat; 1: EnotIntensityMapFrame
  • 13 video_analyzer: camera; video analyzer results.
    Frame type 1: VideoAnalyzerFrame
  • 14 second_video_analyzer: camera; analyzer results for the second camera head (infrared).
    Frame type 1: VideoAnalyzerFrame

Subscription to streams:

  • Open a WebSocket connection to ws://<server_ip>:<server_http_port + 1> (see the sketch after this list).
  • Make read requests to configure the subscription and playback. Playback parameters can be changed independently at any time (for example, the playback time can be changed without changing streams and speed).
  • Deserialize incoming messages using the header information. The server sends playback state messages and frames of the requested streams.
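
A bare-bones connection sketch with QWebSocket (the SDK's streaming_api::Stream class described later wraps this, including reconnect handling):

#include <QWebSocket>

void OpenStream(QWebSocket& socket, const QString& server_ip, quint16 server_http_port)
{
    QObject::connect(&socket, &QWebSocket::binaryMessageReceived,
                     &socket, [](const QByteArray& message)
    {
        // Parse the 12-byte header, then deserialize the frame body.
    });
    // The streaming port is the HTTP port + 1.
    socket.open(QUrl(QString("ws://%1:%2").arg(server_ip).arg(server_http_port + 1)));
}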

Read stream requests:

object (ReadConfig)

Used by the client to initialize a WS connection for stream reading.

Array of objects (ReadStreamConfig) unique

Configuration of reading streams

seek_to
string <date-time>

Playback seek time. Pass null to switch to realtime.

speed
number >= 0

Playback speed. Passing 0 pauses the stream.

{
  • "read_config": {
    },
  • "read_streams": [
    ],
  • "seek_to": "2021-07-24T08:20:49.483Z",
  • "speed": 0
}

Read stream playback state:

end_of_export
string
Enum: "stream_config" "trajectories" "video" "video_metadata" "enot_detection_points" "enot_intensity_map" "enot_clutter_map" "enot_spectrum_map" "enot_spectrum_data" "lynx_spectrum" "second_video" "second_video_metadata" "enot_noise_map" "video_analyzer" "second_video_analyzer"

Indicates that the export of the specified stream has completed. See export_depth in ReadStreamConfig.

end_of_chunk
string
Enum: "stream_config" "trajectories" "video" "video_metadata" "enot_detection_points" "enot_intensity_map" "enot_clutter_map" "enot_spectrum_map" "enot_spectrum_data" "lynx_spectrum" "second_video" "second_video_metadata" "enot_noise_map" "video_analyzer" "second_video_analyzer"

Indicates that playback of the specified stream has reached the end of archive chunk.

{
  • "end_of_export": "stream_config",
  • "end_of_chunk": "stream_config"
}
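
A sketch of reacting to playback state messages inside a binary message handler (message is the received QByteArray). It assumes, as an illustration, that the generated model is oapi::ReadStreamPlaybackState and that its optional enum fields map to oapi::StreamType; the exact generated names are an assumption, not taken from this page:

oapi::ReadStreamPlaybackState state;
if (!streaming_api::Deserialize(state, message))  // no error returned
{
    if (state.end_of_export && *state.end_of_export == oapi::StreamType::E_TRAJECTORIES)
    {
        // The initial archive export (export_depth) of trajectories is complete;
        // the restored scene can be rendered before live frames continue.
    }
}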

Enot radar detection points stream

Enot detection points frame:

azimuth
required
number <float> [ 0 .. 360 ]

Current antenna azimuth (degrees).

required
Array of objects (EnotDetectionPoint) >= 0 items
{
  • "azimuth": 360,
  • "points": [
    ]
}

Enot radar intensity and noise map streams

Enot intensity map format:

range_step
required
number <float>

Size of one ray element (meters)

start_range
required
number <float>

First ray element range (meters)

range_size
required
integer <int32>

Number of ray elements

azimuth_size
required
integer <int32>

Number of azimuthal elements

{
  • "range_step": 0.1,
  • "start_range": 0.1,
  • "range_size": 0,
  • "azimuth_size": 0
}

Enot intensity map frame:

azimuth
required
integer <int32>

Current ray azimuth [ 0 .. EnotIntensityMapFormat.azimuth_size ]

ray_data
required
string <binary>

Buffer containing EnotIntensityMapFormat.range_size elements.

Each unsigned byte encodes a coloring mode and intensity. Values 0..192 are colored intensity and the rest, 192..255, are grayscale intensity.

//GLSL code for intensity and coloring mode decompression
//'value' is a normalized float (0..1)
float intensity;
bool coloring = value < (192.0 / 256.0);
if (coloring)
    intensity = value * 256.0 / 192.0;
else
    intensity = (value * 256.0 - 192.0) / 64.0;
{
  • "azimuth": 0,
  • "ray_data": "string"
}

Enot radar clutter map stream

Enot clutter map format:

azimuth_size
required
integer <int32>

Number of azimuthal elements

range_size
required
integer <int32>

Number of ray elements

range_step
required
number <float>

Size of one ray element (meters)

start_range
required
number <float>

First ray element range (meters)

speed_size
required
integer <int32>

Number of speed elements

speed_step
required
number <float>

Size of one speed element (m/sec)

threshold
required
integer <int32>

Number of detections needed to filter out a cell.

{
  • "azimuth_size": 0,
  • "range_size": 0,
  • "range_step": 0.1,
  • "start_range": 0.1,
  • "speed_size": 0,
  • "speed_step": 0.1,
  • "threshold": 0
}

Enot clutter map frame:

azimuth
required
integer <int32>

Current ray azimuth [ 0 .. EnotClutterMapFormat.azimuth_size ]

mask
required
string <binary>

Buffer size is EnotClutterMapFormat.range_size * EnotClutterMapFormat.speed_size. Each byte contains an EnotClutterMask value for a specific range and speed.

detections
required
string <binary>

Buffer size is EnotClutterMapFormat.range_size * EnotClutterMapFormat.speed_size. Each byte contains the total detection count for a specific range and speed.

{
  • "azimuth": 0,
  • "mask": "string",
  • "detections": "string"
}

Enot clutter mask:

string (EnotClutterMask)
Enum: "empty" "extended_clutter" "clutter"
"empty"

Enot radar spectrum map stream

Enot spectrum map format:

start_range
required
number <float>

Start range (meters)

end_range
required
number <float>

End range (meters)

azimuth_size
required
integer <int32>

Number of azimuthal elements

{
  • "start_range": 0.1,
  • "end_range": 0.1,
  • "azimuth_size": 0
}

Enot spectrum map frame:

azimuths
required
string <binary>

Buffer size is EnotSpectrumMapFormat.azimuth_size. Each unsigned byte contains the intensity of a specific azimuth [ 0 .. 255 ].

{
  • "azimuths": "string"
}

Enot radar spectrum data stream

Enot spectrum frame:

required
object (EnotSpectrumFormat)
min_incremental_azimuth
required
integer <int64>
max_incremental_azimuth
required
integer <int64>
power
required
Array of numbers <float> [ items <float > ]

Power (dB).

cfar_threshold
Array of numbers or null <float> [ items <float > ]

CFAR threshold (dB).

compensation_power
Array of numbers <float> [ items <float > ]

Compensation channel power (dB).

altitude
Array of numbers <float> [ items <float > ]

Altitude (meters).

mask
required
string <binary>

Buffer size is EnotSpectrumFormat.range_size * EnotSpectrumFormat.speed_size. Each byte contains an EnotSpectrumMask value for a specific range and speed.

{
  • "format": {
    },
  • "min_incremental_azimuth": 0,
  • "max_incremental_azimuth": 0,
  • "power": [
    ],
  • "cfar_threshold": [
    ],
  • "compensation_power": [
    ],
  • "altitude": [
    ],
  • "mask": "string"
}

Enot spectrum mask:

An 8-bit mask containing detection and filtering algorithm flags.

bit description
0 Analog to digital converter overflow. False or missing CFAR detections possible.
1 Whether the point passed the CFAR detection filter. The meaning of the remaining bits depends on this value.

If the CFAR detection filter was not passed (bit 1 is 0):

bit description
2 Point removed by central band filter.
3 Point removed by central band power filter.

If the CFAR detection filter was passed (bit 1 is 1):

bit description
2 Point filtered by compensation channel filter.
3 Point filtered by clutter filter.
4 Point filtered by clutter map filter.
5 Point filtered by zones filter.
6 Point became part of the target.
7 Point became most intense part of the target.
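
A bit-testing sketch for this mask in C++ (bit positions from the tables above; the helper names are illustrative):

#include <cstdint>

constexpr bool HasBit(uint8_t mask, int bit) { return ((mask >> bit) & 1) != 0; }

void InspectSpectrumMask(uint8_t mask)
{
    if (HasBit(mask, 0))
    {
        // ADC overflow: false or missing CFAR detections possible.
    }
    if (HasBit(mask, 1))  // point passed the CFAR detection filter
    {
        // Bits 2..5: compensation channel / clutter / clutter map / zones filters;
        // bits 6..7: point became part of / the most intense part of the target.
        const bool part_of_target = HasBit(mask, 6);
        (void)part_of_target;
    }
    else
    {
        // Bits 2..3: central band / central band power filters.
    }
}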

Lynx spectrum data stream

Lynx spectrum frame:

frequency_start
required
integer

Beginning of the device frequency band (MHz)

frequency_end
required
integer

End of the device frequency band (MHz)

power
required
Array of numbers <float> [ items <float > ]

Power for each frequency (dB).

threshold
required
Array of numbers <float> [ items <float > ]

Threshold for each frequency (dB).

mask
required
string <binary>

One byte with LynxSpectrumMask for each frequency.

{
  • "frequency_start": 0,
  • "frequency_end": 0,
  • "power": [
    ],
  • "threshold": [
    ],
  • "mask": "string"
}

Lynx spectrum mask:

An 8-bit mask containing detection and filtering algorithm flags.

bit description
0 Analog to digital converter overflow. False or missing CFAR detections possible.
1 Whether the point passed the CFAR detection filter.
2 Point is removed by clutter filter.
3 Point is removed by interval filter.
4 Point became part of the target.
5 Point became most intense part of the target.

Trajectories stream

The client should store trajectory data and append/update/remove it when new stream frames are received (a sketch of such a store follows the frame description below).

Trajectory changes frame:

time
required
string <date-time>

Changes time.

required
Array of objects (TrajectoryPoint)

New/appended trajectory points.

removed_tracks
required
Array of integers <int64> (Id) unique [ items <int64 > ]

Array of removed trajectories. There will be no more updates for these trajectories.

remove_all
boolean

If true, there will be no more updates for any previous tracks.

{
  • "time": "2021-07-24T08:20:49.483Z",
  • "new_points": [
    ],
  • "removed_tracks": [
    ],
  • "remove_all": true
}
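
A sketch of applying changes frames to a local track store (the oapi types are the SDK's, and the optional-field access mirrors the SDK example at the end of this page; the store layout itself is illustrative):

#include <map>
#include <vector>

using TrackStore = std::map<int64_t, std::vector<oapi::TrajectoryPoint>>;

void ApplyChanges(TrackStore& tracks, const oapi::TrajectoryChangesFrame& frame)
{
    if (frame.remove_all && *frame.remove_all)
        tracks.clear();  // no more updates for any previous tracks

    for (const auto& track_id : frame.removed_tracks)
        tracks.erase(track_id);  // these trajectories are finished

    if (frame.new_points)
    {
        for (const auto& point : *frame.new_points)
            tracks[point.id.track_id].push_back(point);  // append new/updated points
    }
}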

Trajectory interpolation frame:

Interpolated positions related to a specific trajectory point. Interpolated positions are appended or replaced when new points are added to the trajectory.

required
Array of objects non-empty
{
  • "interpolation": [
    ]
}

Trajectory metadata frame:

Additional information related to a specific trajectory point. It is generated by the server and can arrive with a delay after the trajectory point.

required
object (TrajectoryPointId)
alarm_events
Array of integers <int64> (Id) unique [ items <int64 > ]
video_detections
Array of integers <int64> (Id) unique [ items <int64 > ]
{
  • "point_id": {
    },
  • "alarm_events": [
    ],
  • "video_detections": [
    ]
}

Trajectory merger frame:

Tracks from different sources caused by one real target are placed into one group according to the merger configuration. The client can display only the main track in each group to avoid trajectory duplication. A merger frame also includes geometry information about Lynx bearing intersections. Merger frames are generated by the server. To receive them, the client must subscribe to the radar_server trajectories stream by passing an empty source_id in ReadConfig when initializing the reading stream (a sketch follows the frame description below).

required
Array of objects (TrajectoryMergerGroup) non-empty unique

New/updated merger groups.

removed_groups
required
Array of integers <int64> (Id) unique [ items <int64 > ]

Array of removed merger groups. There will be no more updates for these groups.

remove_all
required
boolean

If true, there will be no more updates for any previous groups.

required
Array of objects (BearingMergerMetadata)
{
  • "new_groups": [
    ],
  • "removed_groups": [
    ],
  • "remove_all": true,
  • "bearing_metadata": [
    ]
}
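
For example, with the SDK's Stream class shown later (assuming source_id is the first, optional field of ReadConfig, as in the SDK's initRead example):

// Subscribe to server-generated merger frames: pass an empty source_id in ReadConfig.
stream->initRead({std::nullopt, std::nullopt, std::nullopt});
stream->readStreams({{oapi::StreamType::E_TRAJECTORIES, std::nullopt}});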

Video stream

The video stream can be decoded using the ffmpeg library.

Video format:

codec_id
required
integer <int32>
width
required
integer <int32>
height
required
integer <int32>
data
required
string <binary>

Contains AVCodecParameters::extradata

{
  • "codec_id": 0,
  • "width": 0,
  • "height": 0,
  • "data": "string"
}

Corresponds to AVCodecParameters in the ffmpeg library. Example of codec initialization from a VideoFormat structure:

std::optional<Codec> AvCodecFromVideoFormat(const oapi::VideoFormat& video_format)
{
    AVCodecParameters* av_codec_params_ptr = avcodec_parameters_alloc();
    auto& av_codec_params = *av_codec_params_ptr;

    av_codec_params.codec_type = AVMediaType::AVMEDIA_TYPE_VIDEO;
    av_codec_params.codec_id = static_cast<AVCodecID>(video_format.codec_id);
    av_codec_params.width = video_format.width;
    av_codec_params.height = video_format.height;

    av_codec_params.format = AVPixelFormat::AV_PIX_FMT_YUV420P;

    av_codec_params.extradata = reinterpret_cast<uint8_t*>(
        av_malloc(video_format.data.size() + AV_INPUT_BUFFER_PADDING_SIZE));
    av_codec_params.extradata_size = video_format.data.size();
    memcpy(av_codec_params.extradata, video_format.data.data(), video_format.data.size());
    // Zero the required padding bytes after the payload (decoders may read past the end).
    memset(av_codec_params.extradata + video_format.data.size(), 0, AV_INPUT_BUFFER_PADDING_SIZE);

    Codec codec;
    codec.av_codec = avcodec_find_decoder(av_codec_params.codec_id);
    if (!codec.av_codec)
    {
        avcodec_parameters_free(&av_codec_params_ptr);
        return std::nullopt;
    }

    codec.av_codec_ctx = avcodec_alloc_context3(codec.av_codec);
    if (!codec.av_codec_ctx)
    {
        avcodec_parameters_free(&av_codec_params_ptr);
        return std::nullopt;
    }

    auto res = avcodec_parameters_to_context(codec.av_codec_ctx, av_codec_params_ptr);
    if (res < 0)
    {
        avcodec_parameters_free(&av_codec_params_ptr);
        avcodec_free_context(&codec.av_codec_ctx);
        return std::nullopt;
    }

    // Open codec
    res = avcodec_open2(codec.av_codec_ctx, codec.av_codec, nullptr);
    if (res < 0)
    {
        avcodec_parameters_free(&av_codec_params_ptr);
        avcodec_free_context(&codec.av_codec_ctx);
        return std::nullopt;
    }

    avcodec_parameters_free(&av_codec_params_ptr);

    return codec;
}

Video packet:

key_frame
required
boolean
time
required
string <date-time> (Time)
data
required
string <binary>
{
  • "key_frame": true,
  • "time": "2021-07-24T08:20:49.483Z",
  • "data": "string"
}

Corresponds to AVPacket in the ffmpeg library. Example of packet initialization from a VideoPacket structure:

AVPacket AvPacketFromFrame(oapi::VideoPacket& packet)
{
    AVPacket av_packet;
    av_init_packet(&av_packet);
    av_packet.data = reinterpret_cast<uint8_t*>(packet.data.data());
    av_packet.size = packet.data.size();
    av_packet.flags = packet.key_frame ? AV_PKT_FLAG_KEY : 0;
    return av_packet;
}
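
Feeding the packet to the decoder then follows standard ffmpeg usage; a short sketch continuing the Codec and AvPacketFromFrame helpers above:

void DecodePacket(Codec& codec, oapi::VideoPacket& packet)
{
    AVPacket av_packet = AvPacketFromFrame(packet);
    if (avcodec_send_packet(codec.av_codec_ctx, &av_packet) < 0)
        return;

    AVFrame* frame = av_frame_alloc();
    while (avcodec_receive_frame(codec.av_codec_ctx, frame) == 0)
    {
        // frame->data / frame->linesize now hold a decoded picture
        // (AV_PIX_FMT_YUV420P as configured in AvCodecFromVideoFormat).
    }
    av_frame_free(&frame);
}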

Video Metadata Frame:

video_frame_time_utc_ms
required
integer <int64>

Time of related video frame

object (PTZState)
object (SourceTrajectoryPointId)

Trajectory point at which camera was aimed

{
  • "video_frame_time_utc_ms": 0,
  • "ptz_state": {
    },
  • "aimed_trajectory": {
    }
}

Video Analyzer Result:

state
string (VideoAnalyzerState)
Enum: "error" "disabled" "enabled"

Used only in VideoService analyzer

date
required
string <date-time>

Time of analyzed video frame

Array of objects (VideoAnalyzerObject)
object

Classification index for each model

object

Classification label for each model

object

Score prediction for each model

{
  • "state": "error",
  • "date": "2019-08-24T14:15:22Z",
  • "objs": [
    ],
  • "category_id": {
    },
  • "category": {
    },
  • "score": {
    }
}

SDK overview

Requirements

  • C++ 17
  • CMake 3.x
  • Qt 5.15
  • zlib

Structure

├───libs
│   ├───utils         - helper classes and functions
│   ├───http_types    - base HTTP types
│   ├───http_client   - HTTP client library
│   ├───oapi_models   - Generated OpenAPI structures, serialization / deserialization
│   ├───oapi_client   - Generated API's for client implementation
│   └───streaming_api - WebSockets stream, message header, frame creation and parsing
└───examples
    ├───1_single_request
    ├───2_polling_subscription
    ├───3_long_polling_subscription
    ├───4_source_configuration
    ├───5_reading_radar_streams
    ├───6_archive_export
    └───7_serialization_format_comparison

The code examples below are based on the SDK examples.

Models and serialization

Model:

Each model is represented with an oapi::<name> structure.

  • Non-required fields are represented with std::optional<T>
  • Arrays are represented with std::vector
  • Arrays with unique items are represented with std::set<T>
  • Maps are represented with std::map<QString, T>
  • Enums are represented with enum class
  • Serialization is available for all OpenAPI models.

Serialization:

oapi::ReadStreamRequests request;
request.seek_to = QDateTime::currentDateTimeUtc().addSecs(-30);
request.speed = 0.5;

// default serialization
QByteArray cbor_gzip_buffer =
    streaming_api::Serialize(request,
                             Header(oapi::StreamType::E_STREAM_CONFIG,
                                    utils::to_underlying(oapi::CommonFrameType::E_DATA)));

// manually set time, format & compression
QByteArray json_buffer =
    streaming_api::Serialize(request,
                             Header(oapi::StreamType::E_STREAM_CONFIG,
                                    utils::to_underlying(oapi::CommonFrameType::E_DATA),
                                    QDateTime::currentDateTimeUtc().toMSecsSinceEpoch(),
                                    false,  // disable gzip compression
                                    oapi::SerializationFormat::E_JSON));

Deserialization:

QByteArray buffer = ...;
oapi::TrajectoryChangesFrame traj;
std::optional<Error> error = streaming_api::Deserialize(traj, buffer);
if (error)
{
    qDebug() << *error;
    return;
}
qDebug() << traj.time.toString(Qt::ISODateWithMs);

Streaming API

Stream subscription:

The streaming_api::Stream class provides a WebSocket stream with easy configuration and reconnect handling.

...

auto stream = new streaming_api::Stream(owner);

// Open connection
stream->open(streams_address);

// Set handler for incoming stream frames & playback state
QObject::connect(stream, &streaming_api::Stream::onMessage, owner, &ProcessMessage);

//Configure as reading stream with default format and compression
stream->initRead({source_id, std::nullopt, std::nullopt});

//Load 60 seconds of trajectories archive to restore scene before starting a playback
const int kExportDepthSecs = 60;

//Set requested streams & their configurations
stream->readStreams({{oapi::StreamType::E_TRAJECTORIES, kExportDepthSecs},
                     {oapi::StreamType::E_ENOT_INTENSITY_MAP, std::nullopt}});

//Configure playback
stream->playback()->setSpeed(0.5);
stream->playback()->seek(QDateTime::currentDateTimeUtc().addSecs(-60));
//for switch back to realtime: stream->playback()->seekToRealtime();

...

void ProcessMessage(const QByteArray& message)
{
    //Parse frame depending on StreamType and FrameType
    const streaming_api::Header* header = streaming_api::GetHeader(message);
    switch (header->stream_type)
    {
    case oapi::StreamType::E_TRAJECTORIES:
    {
        if (header->frame_type == utils::to_underlying(oapi::TrajectoryFrameType::E_CHANGES))
        {
            oapi::TrajectoryChangesFrame frame;
            auto error = streaming_api::Deserialize(frame, message);
            if (error)
            {
                qDebug() << "TrajectoryChangesFrame deserialization error:" << *error;
                return;
            }

            if (frame.new_points)
            {
                for (auto& p : *frame.new_points)
                {
                    qDebug() << "Trajectory" << p.id.track_id << "updated, position:" << toJson(p.position);
                }
            }
        }
        //handle other frame types
        ...
    }
    //handle other stream types
    ...

Requests and subscriptions

API class:

Each API is represented with an oapi::<name>Api class:

#include <oapi_client/ClientToServerApi.h>

...
auto server_api = QSharedPointer<oapi::ClientToServerApi>::create();
server_api->setHttpConnectionSettings(utils::NetworkAddress
                              {
                                  QHostAddress("127.0.0.1"),
                                  3000
                              });

Calling a single asynchronous request:

server_api->getTargetClasses(this, [] (RequestResult<oapi::TargetClasses> result,
                   http_client::NetworkReplyData /*network_reply_data*/)
{
    if (result.error)
    {
        qDebug() << result.error->description;
        return;
    }

    const oapi::TargetClasses& classes = *result.result;

    for (const auto& c: classes.classes)
    {
        qDebug() << "Class: id" << c.first;

        for (const auto& locale : c.second.locales)
        {
            qDebug() << "language id: " << locale.first
                      << "text: " << locale.second;
        }

        qDebug() << "";
    }
});

Subscriptions:

Subscription classes <name>Subscription are generated for all API requests.

Polling (periodic) subscription example:

auto subscription =
    QSharedPointer<oapi::GetSourcesInfoSubscription>(new oapi::GetSourcesInfoSubscription());

QObject::connect(subscription.get(),
                 &oapi::GetSourcesInfoSubscription::onResponse,
                 this,
                 [s = subscription.get()]()
                 {
                     if (s->error())
                     {
                         qDebug() << s->error()->description;
                         return;
                     }

                     const oapi::SourcesInfo& sources = *s->response();
                     qDebug() << oapi::toJsonString(sources);
                 });

const int kRequestTimeoutMs = 1000;
const int kRequestPeriodMs = 1000;
subscription->setRequest(
    [api = server_api.get(), kRequestTimeoutMs](QObject* owner, auto on_response)
    { api->getSourcesInfo(std::nullopt, owner, on_response, kRequestTimeoutMs); },
    kRequestPeriodMs);

Long polling subscription example (the server replies only when data changes):

auto subscription =
    QSharedPointer<oapi::GetSourcesInfoSubscription>(new oapi::GetSourcesInfoSubscription());

QObject::connect(subscription.get(),
                 &oapi::GetSourcesInfoSubscription::onResponse,
                 [s = subscription.get()]()
                 {
                     if (s->error())
                     {
                         qDebug() << s->error()->description;
                         return;
                     }

                     const oapi::SourcesInfo& sources = *s->response();
                     qDebug() << oapi::toJsonString(sources);
                 });

subscription->setRequest(
    [api = server_api.get(), s = subscription.get()](QObject* owner, auto on_response) {
        api->getSourcesInfo(
            s->lastModifiedTime(), owner, on_response, http_client::kLongPollTimeout);
    });