Glossary

AAC (Advanced Audio Coding) – is a proprietary lossy audio compression format.

Adaptive streaming – is a type of streaming where several streams with different bitrates are created on the server for the same broadcast. On the user side, the player requests, receives, and plays the stream that best suits the current Internet connection.
It can also mean streaming where the encoder is able to change the stream bitrate on the fly based on network information from the decoder side: for example, WebRTC streaming can adjust the encoding bitrate according to decoder network statistics such as RTT and available bandwidth.
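
A minimal sketch of the player-side selection logic described above, written in Python. The variant bitrates, the 0.8 safety factor, and the measured bandwidth value are assumptions made for this illustration, not values from any particular player.

```python
# Hypothetical example: pick the highest-bitrate variant that fits the
# currently measured bandwidth, leaving some headroom for fluctuations.

VARIANTS_BPS = [400_000, 1_200_000, 2_500_000, 5_000_000]  # server-side renditions

def pick_variant(measured_bandwidth_bps: float, safety: float = 0.8) -> int:
    """Return the bitrate of the best variant that fits the connection."""
    budget = measured_bandwidth_bps * safety
    suitable = [b for b in VARIANTS_BPS if b <= budget]
    # Fall back to the lowest rendition if even that does not fit.
    return max(suitable) if suitable else min(VARIANTS_BPS)

print(pick_variant(3_500_000))  # -> 2500000 (the 2.5 Mbit/s rendition)
```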

API – Application Programming Interface. A description of the ways (a set of classes, procedures, functions, structures, or constants) in which one computer program can communicate with another.

Bandwidth – the amount of information that can pass through a channel in a given time interval.

Bitrate – is the number of bits used to transmit and process data per unit of time. Bitrate is used to measure the effective transmission rate of a data stream over a channel, i.e. the minimum channel capacity required to pass the stream without delay.
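
A quick illustration of the relationship between data volume, duration, and bitrate; the file size and duration below are made-up numbers.

```python
# Hypothetical example: average bitrate of a recording and, equivalently,
# the minimum channel capacity needed to pass it without delay.

size_bytes = 450 * 1024 * 1024   # 450 MiB of encoded video
duration_s = 30 * 60             # 30 minutes

bitrate_bps = size_bytes * 8 / duration_s
print(f"average bitrate: {bitrate_bps / 1_000_000:.2f} Mbit/s")  # -> 2.10 Mbit/s
```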

Codec – a device or program capable of performing data or signal conversion. A stream or signal is encoded using a codec to store, transmit or compress it, and decoded to view or modify it. Codecs are used in digital video and audio processing.

Latency – is the difference in time between the moment a video frame is captured (and encoded by the encoder) on the source device and the moment that frame is played back on the end user's display; in other words, the time from the frame's point of origin until the frame is rendered on the screen.

HLS (HTTP Live Streaming) – is an HTTP-based media streaming protocol developed by Apple. It is based on the principle of dividing an entire stream into small fragments that are sequentially downloaded via HTTP. The stream is continuous and can theoretically be infinite. At the beginning of a session, a playlist is downloaded in M3U format, which contains metadata about the nested streams.
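
As an illustration, here is a minimal sketch of reading the variant streams from such a playlist; the playlist text, file names, and resolutions are invented for this example.

```python
# Hypothetical example: parse the master (multivariant) M3U playlist that an
# HLS session starts with and list the available variant streams.

MASTER_PLAYLIST = """\
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=842x480
video_480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
video_720p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
video_1080p.m3u8
"""

def parse_variants(playlist: str) -> list[tuple[int, str]]:
    """Return (bandwidth, uri) pairs for every variant stream in the playlist."""
    variants = []
    lines = playlist.strip().splitlines()
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF"):
            attrs = line.split(":", 1)[1]
            bandwidth = int(attrs.split("BANDWIDTH=")[1].split(",")[0])
            variants.append((bandwidth, lines[i + 1]))  # the next line is the URI
    return variants

print(parse_variants(MASTER_PLAYLIST))
# [(1200000, 'video_480p.m3u8'), (2500000, 'video_720p.m3u8'), (5000000, 'video_1080p.m3u8')]
```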

H.265/HEVC – is a video compression format using more efficient algorithms than H.264/AVC. It is a joint development of ITU-T VCEG and the Moving Picture Experts Group (MPEG). The standard is designed for Internet streaming, videoconferencing, television broadcasting, etc. Frame formats up to 8K (UHDTV) with a resolution of 8192×4320 pixels are supported.

RTMP (Real Time Messaging Protocol) – is a streaming protocol developed by Macromedia (now Adobe). It is mainly used to stream audio and video from webcams over the Internet.

RTT (round-trip time) – is the time it takes for a signal to be sent plus the time it takes for an acknowledgement of that signal to be received.
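
For example, a rough RTT estimate can be obtained by timing a TCP handshake, since the handshake completes after one request/acknowledgement round trip; the host and port below are arbitrary.

```python
# Hypothetical example: approximate RTT as the time needed to complete a TCP
# handshake (SYN sent, SYN-ACK received back), i.e. one round trip.

import socket
import time

def estimate_rtt_ms(host: str, port: int = 443) -> float:
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=5):
        pass  # connection established after one round trip
    return (time.monotonic() - start) * 1000

print(f"RTT to example.com: {estimate_rtt_ms('example.com'):.1f} ms")
```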

SRT (Secure Reliable Transport) – is a protocol developed by Haivision in 2012. SRT is based on UDT (UDP-based Data Transfer Protocol) and ARQ packet recovery technology. It supports AES-128 and AES-256 encryption, as well as listener, caller, and rendezvous modes that allow connections through firewalls and NATs.

WebRTC (Web Real Time Communications) is a standard that describes how audio, video and content are streamed between browsers (without installing plug-ins or other extensions) or other supporting applications in real time. This technology makes it possible to turn a browser into a video conferencing terminal.

Signaling server is the coordination point of WebRTC that provides communication between clients: it handles initializing and closing the connection and error reporting.

BMD (Blackmagic Design) is one of the leading companies in the development of the latest video technology and production of hardware solutions for working with video.

Transwrapping is the repackaging of a stream or file from one container format or delivery protocol to another without re-encoding the audio and video data.

Nvidia NVDEC is a hardware decoder in Nvidia video cards that provides fully accelerated hardware video decoding. Nvidia NVENC is a hardware real-time multithreaded video encoder.

NDI® is a standard developed by NewTek for video sharing on a local network. NDI® allows multiple video systems to find each other and communicate within a local network, encoding, transmitting and receiving multiple audio and video streams with low latency in real time. It can work bi-directionally with multiple video streams over a common connection.

SDI (Serial Digital Interface) is a digital signal distribution standard developed specifically for the television industry.

Chroma key – a technique for combining two or more images or frames into one composition by replacing areas of a uniform key color (usually green or blue) with another image; the electronic equivalent of colored rear projection, used in television and in modern digital film production.
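
A very simple green-screen key, sketched with NumPy; the threshold values are arbitrary assumptions, and real keyers use more careful color-space math and edge handling.

```python
# Hypothetical example: replace strongly green pixels of a foreground frame
# with the corresponding pixels of a background frame.

import numpy as np

def chroma_key(foreground: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Composite two HxWx3 uint8 frames by keying out green in the foreground."""
    fg = foreground.astype(np.int16)
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # A pixel counts as "green screen" if green clearly dominates red and blue.
    mask = (g > 100) & (g > r + 40) & (g > b + 40)
    result = foreground.copy()
    result[mask] = background[mask]
    return result

# Usage with synthetic frames: a solid green frame keyed over a blue background.
fg = np.zeros((4, 4, 3), dtype=np.uint8)
fg[:] = (0, 200, 0)              # solid green foreground
bg = np.zeros((4, 4, 3), dtype=np.uint8)
bg[:] = (0, 0, 255)              # solid blue background
print(chroma_key(fg, bg)[0, 0])  # -> [  0   0 255]
```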