AxisCommunications/media-stream-library-js

JavaScript library to handle media streams on the command line (Node.js) and in the browser.

Installation

Make sure you have Node installed on your machine. Then, to install the library:

npm install media-stream-library

or

yarn add media-stream-library

Streams

Provides a way to play RTP streams (H.264/AAC or JPEG) in a browser by converting them to ISO-BMFF and feeding the stream to a SourceBuffer using the Media Source Extensions standard. The RTSP server should provide two-way communication over WebSocket. Additionally, streaming MP4 over HTTP to a SourceBuffer is also provided as a way to lower latency compared to using a URL on a video tag directly.

This library is not a full media player: the framework provides no video controls, progress bar, or other features typically associated with a media player. For a simple React-based player, we refer to the player section below.

However, getting video to play in the browser is quite easy (check the browser example). There are currently no codecs included either; we rely on browser support for that.

Importing

script tag You can directly include the msl-streams.min.js file (available as a GitHub release asset) in your browser (check the browser example) as an ES module. For this to work, your own script needs to be a module (use type="module" in the script tag). Make sure the src in your script tag matches the path of the file on the server, e.g. if it's at the top level:

<script type="module" src="/index.js"></script>

and in index.js you would use:

import { ... } from '/msl-streams.min.js';

bundler Alternatively, you can use import statements in your JS/TS code with the package name if you are going to bundle it yourself:

import { ... } from 'media-stream-library';

Components and pipelines

The library contains a collection of components that can be connected together to form media pipelines. The components are a low-level abstraction on top of the Web Streams API to allow two-way communication, while media pipelines are sets of components where the streams are connected. The provided pipelines are a minimal set that provides WebSocket+RTSP => H.264/AAC or JPEG, and HTTP+MP4, with some extra functionality such as authentication, retry, and capture. For more advanced usage, you can construct your own pipelines using the provided ones as a template.
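To make the two-way idea concrete, here is a minimal sketch built directly on the Web Streams API. Note that parseMessage and makeComponent are hypothetical illustrations of the concept, not the library's actual component API:

```javascript
// A simplified sketch of a two-way "component" (hypothetical helper
// names; not the library's actual API). Data from the server flows
// through `down`, commands back to the server flow through `up`.

// Hypothetical downstream parser: tag incoming chunks by type.
function parseMessage(chunk) {
  return { type: chunk.startsWith('RTP') ? 'rtp' : 'control', payload: chunk }
}

function makeComponent() {
  return {
    down: new TransformStream({
      transform(chunk, controller) {
        controller.enqueue(parseMessage(chunk))
      },
    }),
    up: new TransformStream(), // identity pass-through going back up
  }
}
```

Pipelines then connect components by piping their streams together, e.g. `source.pipeThrough(component.down).pipeTo(sink)`.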

Check the examples section to see how these can be used in your own code. To run the examples yourself, you'll need to clone this repository locally and follow the developer instructions.

Player

A video player based on React, intended primarily for Axis cameras. The main idea is to define the video state entirely within specialized React components for each of the different supported formats (currently MP4 over HTTP, RTP over WebSocket, and still images). The main video player will only handle the intended video state (attached to handlers) and format. The player is built on top of streams, which provides basic pipeline functionality for the different formats.

You can either import the Player or BasicPlayer and use them directly (see the example applications). If you want to build your own customized player, you can look at the latter component and build your own player using the Container, Layer, and PlaybackArea components.

Basic requirements

The player specifically targets AXIS IP cameras, because we make underlying calls to AXIS-specific APIs to get the video streams.

Firmware requirements

  • For WebSocket+RTSP to work you need at least firmware 6.50 (LTS)
  • For HTTP+MP4 to work you need at least firmware 9.80 (LTS)

Importing

If you don't use the player as part of a React app, the easiest way to use it is to download the msl-player.min.js file from the [releases](https://github.com/AxisCommunications/media-stream-library-js/releases/latest) page and include it as an ES module. Make sure your own script has type="module" and then import directly from the file, e.g.:

import { ... } from '/msl-player.min.js';

Then, you can use the <media-stream-player/> tag, similar to how you would use <video/> to include a video element, and provide the camera IP as hostname:

<media-stream-player hostname="192.168.0.90"/>

You can find an example of this under example-player-webcomponent.

Supported properties right now are:

| Property | Comment |
| --- | --- |
| variant | Supported choices are basic or advanced. Refers to BasicPlayer and Player. |
| hostname | The IP address to your device |
| autoplay | If the property exists, we try to autoplay your video |
| autoretry | If the property exists, we try to auto retry your video on errors and if ended |
| secure | If the property exists, we will connect with https instead of http |
| format | Accepted values are JPEG, RTP_JPEG, RTP_H264, or MP4_H264 |
| compression | Accepted values are 0..100, with 10 between each step |
| resolution | Written as WidthXHeight, e.g. 1920x1080 |
| rotation | Accepted values are 0, 90, 180 and 270 |
| camera | Accepted values are 0...n or quad depending on your device |
| RTP_H264 / RTP_JPEG / MP4_H264 specific properties | |
| fps | Accepted values are 0...n |
| audio | Accepted values are 0 (off) and 1 (on) |
| clock | Accepted values are 0 (hide) and 1 (show) |
| date | Accepted values are 0 (hide) and 1 (show) |
| text | Accepted values are 0 (hide text overlay) and 1 (show text overlay) |
| textstring | A percent-encoded string for the text overlay |
| textcolor | Accepted values are black and white |
| textbackgroundcolor | Accepted values are black, white, transparent and semitransparent |
| textpos | Accepted values are 0 (top) and 1 (bottom) |
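As a small illustration of assembling these attributes into a tag, note that textstring must be percent-encoded. The playerTag helper below is hypothetical; only the attribute names come from the table above:

```javascript
// Build a <media-stream-player> tag string from a props object
// (illustrative helper, not part of the library).
// `textstring` is percent-encoded, as the properties table requires.
function playerTag(props) {
  const attrs = Object.entries(props)
    .map(([key, value]) =>
      key === 'textstring'
        ? `${key}="${encodeURIComponent(value)}"`
        : `${key}="${value}"`,
    )
    .join(' ')
  return `<media-stream-player ${attrs}/>`
}

console.log(playerTag({ hostname: '192.168.0.90', text: '1', textstring: 'Cat cam' }))
// -> <media-stream-player hostname="192.168.0.90" text="1" textstring="Cat%20cam"/>
```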

Example:

<media-stream-player hostname="192.168.0.90" format="RTP_H264" autoplay/>

You may need to start a localhost server to get H.264 or Motion JPEG video to run properly. It doesn't work with the file:/// protocol. The easiest way to do that is to run:

just run example-player-webcomponent

Note that using anything other than the actual hostname you're hosting from will result in CORS errors for some video formats. You'll need to proxy the camera or load a page from the camera (in which case you can set window.location.host as the hostname).

As part of your React application

If you want to import the player as a React component into your own code, or use parts of the player, you'll need to install the package as a dependency. You will also need to install a number of peer dependencies, such as luxon (which we use for date and time purposes) and react/react-dom. You can find an example of this under example-player-react, e.g.:

import { BasicPlayer } from 'media-stream-library/player'

To run our example React app, you can start a vite dev server with:

export MSP_CAMERA=http://192.168.0.90
cd player
node vite.mjs

where you specify the IP of the camera you want to proxy as the MSP_CAMERA environment variable (default is 192.168.0.90). The vite dev server will proxy requests to the camera, so that you'll have no CORS issues.
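The fallback behaviour can be sketched as follows. The cameraTarget helper is illustrative; only MSP_CAMERA and the 192.168.0.90 default come from the text above:

```javascript
// Pick the camera to proxy: the MSP_CAMERA environment variable,
// falling back to the documented default (illustrative sketch,
// not the example server's actual code).
function cameraTarget(env = process.env) {
  return env.MSP_CAMERA ?? 'http://192.168.0.90'
}

console.log(cameraTarget({ MSP_CAMERA: 'http://192.168.0.100' }))
// -> http://192.168.0.100
console.log(cameraTarget({}))
// -> http://192.168.0.90
```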

Overlay

A small React library to make it easier to draw SVG elements with a custom user-defined coordinate system, especially when there is also a transformation mapping the user coordinates onto the visible area.

A typical example of this is drawing overlays on top of a transformed image, when the overlay coordinates are relative to the non-transformed image. In that case, the coordinates are often relative to the image size, and not the actual SVG drawing size.

In addition, a set of helper components and hooks are provided to make it easier to control drawing (e.g. by offering clamping functions), or to make it simpler to manipulate objects (e.g. dragging).

Importing

Check the example-overlay-react/ directory for an example of how to use this library with your application.

Coordinate conversion

The main component is called Foundation, and it provides you with the functions that transform between user and SVG coordinate systems. This is basically all you ever need, and what this library is about.

To show how this works, let's say you want to draw a rectangle on top of an image of your cat (1920x1080), around the cat's face, and you know the coordinates of the face in the image. The size of the drawing area in the browser is 800x450 pixels (the viewbox of the SVG element overlaying the image).

The first example shows a situation where you have the image's resolution as coordinates (pixel coordinates):

       User coordinate system =              SVG coordinate system
         Image coordinates

            x                                       x
        +---------->                            +---------->
     (0,0)                                   (0,0)
       +----------------------+                +----------------------+
     + |               /\_/\  |              + |              XXXXXXX |
     | |              ( o.o ) |              | |              X o.o X |
     | |               > ^ <  |   +------>   | |              XXXXXXX |
     y |                      |              y |                      |
     | |                      |              | |                      |
     v |                      |              v |                      |
       +----------------------+                +----------------------+
                    (1920,1080)                               (800,450)

In this case it would be trivial to overlay an SVG and convert the face's coordinates to SVG coordinates to use for the <circle> element's cx and cy: you just scale 1920 to 800 and 1080 to 450.
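The scaling in this first case can be written down directly. This is a sketch of the math only; the pixelToSvg name is illustrative, not the library's:

```javascript
// Trivial case: image pixel coordinates (1920x1080) scale directly to
// SVG viewbox coordinates (800x450); both systems have the origin
// top-left with y pointing down.
function pixelToSvg([x, y], image = [1920, 1080], svg = [800, 450]) {
  return [(x * svg[0]) / image[0], (y * svg[1]) / image[1]]
}

console.log(pixelToSvg([960, 540]))
// -> [400, 225]  (the image center maps to the SVG center)
```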

However, you might only have the coordinates of the face relative to the picture boundaries:

       User coordinate system =              SVG coordinate system
      Relative image coordinates

            x                                       x
        +---------->                            +---------->
                         (1,1)               (0,0)
       +----------------------+                +----------------------+
     ^ |               /\_/\  |              + |              XXXXXXX |
     | |              ( o.o ) |              | |              X o.o X |
     | |               > ^ <  |   +------>   | |              XXXXXXX |
     y |                      |              y |                      |
     | |                      |              | |                      |
     + |                      |              v |                      |
       +----------------------+                +----------------------+
     (0,0)                                                    (800,450)

where now you would have to take into account the reversal of the y coordinates as well, so the face, which is approximately at a y coordinate of 0.66, would turn out to have an SVG y coordinate of around 150.
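A sketch of this conversion (relativeToSvg is an illustrative name; the library's Foundation component does this for you):

```javascript
// Convert relative image coordinates (origin bottom-left, range 0..1)
// to SVG coordinates (origin top-left, here an 800x450 viewbox).
// The y axis flips: user y = 0 is the bottom, SVG y = 0 is the top.
function relativeToSvg([x, y], svgWidth = 800, svgHeight = 450) {
  return [x * svgWidth, (1 - y) * svgHeight]
}

console.log(relativeToSvg([0.5, 0.66]).map(Math.round))
// -> [400, 153]  (user y = 0.66 lands around 150, as described above)
```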

As a third example, you still have the relative coordinates of the face with respect to the whole picture, but only part of the picture is shown:

              User coordinate system
                                      (1,1)
       +------------------------------------+
       |                                    |
       |            Visible area            |       SVG coordinate system
       |                                    |
       |      (0.4,0.8)        (0.9,0.8)    |    (0,0)
       |          +----------------+        |      +----------------+
       |          |        /\_/\   |        |      |       XXXXXXX  |
       |          |       ( o.o )  |        | +--> |       X o.o X  |
       |          |        > ^ <   |        |      |       XXXXXXX  |
       |          |                |        |      |                |
       |          +----------------+        |      +----------------+
       |      (0.4,0.5)                     |                (800,450)
       |                                    |
       +------------------------------------+
     (0,0)

In this case you'll need a transformation that takes into account how the visible area maps onto the complete area before you can determine the final SVG coordinates.
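A sketch of that composed mapping (visibleToSvg is an illustrative name; Foundation handles this internally, driven by the transformation matrix you supply):

```javascript
// Map relative user coordinates through a visible area onto SVG
// coordinates (a sketch of the math only). The visible area is given
// as [xMin, yMin, xMax, yMax] in user coordinates, e.g.
// [0.4, 0.5, 0.9, 0.8] in the diagram above.
function visibleToSvg([x, y], [xMin, yMin, xMax, yMax], svgWidth = 800, svgHeight = 450) {
  const u = (x - xMin) / (xMax - xMin) // 0..1 across the visible area
  const v = (y - yMin) / (yMax - yMin) // 0..1 up the visible area
  return [u * svgWidth, (1 - v) * svgHeight] // y flips, as before
}

console.log(visibleToSvg([0.4, 0.8], [0.4, 0.5, 0.9, 0.8]))
// -> [ 0, 0 ]  (top-left corner of the visible area)
```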

This library aims to take care of all these conversions for you, as long as you can define your user coordinate system (with the location of the "real" objects) and an optional transformation matrix (describing the visible area the SVG overlay applies to). The fact that this mainly comes in handy when matching what you draw to some underlying image and coordinate system is the reason the name of this library is the way it is.

Utilities

Other than coordinate conversion, there are also a couple of utilities aimed at making it easier to interact with the SVG components.

Convenience functions for clamping are provided by the Liner component, which lets you specify an area to which to limit your components. There is also a useDraggable hook to simplify having to deal with moving things around.
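The kind of clamping such a helper performs can be sketched like this (clampToArea is an illustrative name, not the Liner component's actual signature):

```javascript
// Keep a point inside a bounding area [xMin, yMin, xMax, yMax], the
// basic operation behind limiting components to a region of the SVG.
function clampToArea([x, y], [xMin, yMin, xMax, yMax]) {
  return [
    Math.min(Math.max(x, xMin), xMax),
    Math.min(Math.max(y, yMin), yMax),
  ]
}

console.log(clampToArea([900, -20], [0, 0, 800, 450]))
// -> [ 800, 0 ]  (the point is pulled back onto the area's edge)
```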

Components

With the React SVG elements and utilities as building blocks, you can then make your own SVG components to be used inside the Foundation component. The best way to get started is to have a look at the example section, which shows how you can build your components to make use of this library. The example can be run with just run overlay. Instead of defining a whole array of new SVG components that wrap the browser elements, the idea is that you can easily do this already with React; we therefore focused on providing the basics to aid with building your components, instead of creating a component library.

Contributing

For development, you'll need a local installation of Node.js, and yarn to install dependencies. To run commands, you need just, which can be installed using prebuilt binaries or yarn, e.g. yarn global add just-install.

Please read our contributing guidelines before making pull requests.

FAQ

Will it work with this library if it works with VLC?

Not necessarily. We only support a particular subset of the protocol useful for basic streaming from IP cameras. With RTSP that is H.264+AAC or JPEG, and only some simple profiles/levels are supported. For MP4, it depends entirely on your browser whether the media can be played.

Do I need to use RTSP for live (or low-latency) video?

Since this library only supports RTSP through some form of TCP connection, it's going to have similar latency to streaming MP4 over HTTP. For true low-latency real-time usage, you should either use a stand-alone player that can handle RTP over UDP, or use WebRTC in the browser.

You should expect in-browser latency of several frames. When using Firefox, you might need to set the duration of the MediaSource to 0 to force live behaviour with lower latency (see one of the browser examples). The exact latency is controlled by the browser itself, and the data inside the media stream can affect this (e.g. whether audio is present or not).

Does this library support audio?

Yes, it does, with a few caveats:

  • Make sure your AXIS camera actually supports audio.
  • Make sure audio is enabled on the camera.
  • It only works with H.264, and only after user interaction with the volume slider.

How do I autoplay video?

Browsers will only allow a video to autoplay if it's muted. If the video is not muted, then it will only play if the play method was called from inside a handler of a user-initiated event. Note that different browsers can have different behaviours. Read https://developer.chrome.com/blog/autoplay for more details.

Acknowledgements

The icons used are from https://github.com/google/material-design-icons/, which are available under the Apache 2.0 license; more information can be found at http://google.github.io/material-design-icons.

The spinner is from https://github.com/SamHerbert/SVG-Loaders, available under the MIT license.
