- model: Raspberry Pi 4B
- system: Ubuntu 22.04 aarch64
While resources on HTTP streaming from the Pi are widely available over the Internet, few address the libcamera library, which is the only option under aarch64, and few address the low-latency aspect of the streaming.
I managed to achieve the above with the following:
- Ubuntu Server 22.04 LTS arm64 as system
- kbingham/libcamera
- raspberrypi/libcamera-apps for libcamera-vid
- novnc/websockify to proxy tcp port to websocket
- nginx/nginx to serve html
- samirkumardas/jmuxer to render h264 on clients side
This setup has very low latency because the rendering workload is carried by the client, while the Pi only hardware-encodes the video into h264 and serves the binary stream directly.
pipeline
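The data path, using the components and ports described below, is:

```
libcamera-vid --(h264 over tcp:8000)--> websockify --(ws on :8001)--> nginx /rpicam/ws/ --> browser (jmuxer)
```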
install
- compile libcamera following the guide
- compile libcamera-apps following the guide
- install websockify following the guide
- launch the following, or optionally create systemd services for these commands

  ```shell
  libcamera-vid -t 0 --width 1920 --height 1080 --inline --listen -o tcp://0.0.0.0:8000
  websockify 0.0.0.0:8001 0.0.0.0:8000
  ```
- create a static nginx html server with `/rpicam/index.html` as follows

  ```html
  <head>
    <meta name="color-scheme" content="dark">
    <script type="text/javascript" src="https://cdn.jsdelivr.net/npm/[email protected]/dist/jmuxer.min.js"></script>
    <style>
      body { margin: 0; }
    </style>
  </head>
  <body>
    <div id="container" style="width: 100%; margin: 0 auto;">
      <video width="100%" autoplay muted id="player"></video>
    </div>
    <script>
      function parse(data) {
        var input = new Uint8Array(data);
        return { video: input };
      }

      window.onload = function () {
        var socketURL = document.location.href.replace('http', 'ws') + 'ws/';
        var jmuxer = new JMuxer({
          node: 'player',
          mode: 'video',
          flushingTime: 0,
          fps: 30,
          debug: false
        });
        var ws = new WebSocket(socketURL);
        ws.binaryType = 'arraybuffer';
        ws.addEventListener('message', function (event) {
          jmuxer.feed(parse(event.data));
        });
      };
    </script>
  </body>
  ```
- configure the nginx server to serve `/rpicam/` from the static file directory, and `/rpicam/ws/` to `0.0.0.0:8001`. A typical configuration of the latter is as follows.

  ```nginx
  location /rpicam/ws/ {
      proxy_redirect off;
      proxy_pass http://0.0.0.0:8001;
      proxy_http_version 1.1;
      proxy_set_header Upgrade $http_upgrade;
      proxy_set_header Connection "upgrade";
  }
  ```
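For completeness, the static part serving `/rpicam/` might look like the following sketch (the root path is an assumption, not from the original setup):

```nginx
location /rpicam/ {
    # index.html lives in /var/www/html/rpicam/ in this sketch
    root /var/www/html;
    index index.html;
}
```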
known issues and solutions
- the current configuration can support only one client at a time
  - a dedicated backend to stream the tcp/websocket data to multiple clients would be necessary
- libcamera-vid will exit each time the websocket is closed
  - auto restart is possible using systemd
  - if one has a dedicated backend, this can be managed there
- no audio is served. jmuxer supports audio; to combine audio one would have to either
  - write a dedicated backend to repackage the binary data, adding audio and duration information
  - or serve audio through a separate websocket, which could lead to audio and video falling out of sync
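The systemd auto-restart mentioned above could be a unit along these lines (unit name and binary path are assumptions; adjust to where libcamera-vid was installed):

```ini
# /etc/systemd/system/rpicam-vid.service
[Unit]
Description=libcamera-vid h264 TCP stream
After=network.target

[Service]
ExecStart=/usr/local/bin/libcamera-vid -t 0 --width 1920 --height 1080 --inline --listen -o tcp://0.0.0.0:8000
Restart=always
RestartSec=1

[Install]
WantedBy=multi-user.target
```

Enable it with `systemctl enable --now rpicam-vid`; a matching unit for websockify would follow the same pattern.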
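A minimal sketch of such a dedicated backend, under the assumptions of the setup above: it connects to libcamera-vid's tcp listener (port 8000) and rebroadcasts every chunk to any number of downstream tcp clients (e.g. on port 8002, which websockify could then proxy to a websocket). Ports, class names, and buffer sizes are illustrative, not part of the original setup.

```python
import asyncio

class Relay:
    def __init__(self):
        self.clients = set()  # downstream StreamWriters currently connected

    async def handle_client(self, reader, writer):
        # register a viewer, then wait until it disconnects
        self.clients.add(writer)
        try:
            await reader.read()  # returns at EOF, i.e. when the client closes
        finally:
            self.clients.discard(writer)
            writer.close()

    async def pump(self, host, port):
        # read the h264 byte stream from libcamera-vid and broadcast it
        reader, _ = await asyncio.open_connection(host, port)
        while True:
            chunk = await reader.read(65536)
            if not chunk:  # upstream closed
                break
            for w in list(self.clients):
                try:
                    w.write(chunk)
                except ConnectionError:
                    self.clients.discard(w)

async def serve(relay, listen_port=8002, upstream=("127.0.0.1", 8000)):
    server = await asyncio.start_server(relay.handle_client, "0.0.0.0", listen_port)
    async with server:
        await asyncio.gather(server.serve_forever(), relay.pump(*upstream))

# to run: asyncio.run(serve(Relay()))
```

Note that because libcamera-vid is started with `--inline`, SPS/PPS headers are repeated before every keyframe, so a client that joins mid-stream can begin decoding at the next keyframe.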
For now I am happy with the result, but a dedicated backend would be optimal/necessary to solve these issues.