Creating a 2×2 Security Camera Matrix using FFmpeg: https://youtu.be/Xr1XTl0cAAE
Streaming an IP Camera to a Web Browser using FFmpeg: https://youtu.be/ztjT2YqQ2Hc
Raspberry Pi (Amazon Affiliate)
US: https://amzn.to/2LpyVob
UK: https://amzn.to/2Z2inKX
CA: https://amzn.to/2y5yAUA
ES: https://amzn.to/3fSDhSS
FR: https://amzn.to/2LpurxT
IT: https://amzn.to/2T2VZNu
DE: https://amzn.to/3buHRmQ
IN: https://amzn.to/2B3PGTN
Install nginx and ffmpeg
sudo apt install nginx ffmpeg
Create Index File at /var/www/html/index.html
Be sure to replace ip_address in each loadSource() URL with the IP address of your server.
<!DOCTYPE html>
<html lang="en">
<head>
  <title>Live Cam</title>
  <script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
</head>
<body>
  <!-- muted is required for autoplay to work in most modern browsers -->
  <video id="video1" autoplay muted controls></video>
  <script>
    if (Hls.isSupported()) {
      var video1 = document.getElementById('video1');
      var hls1 = new Hls();
      // bind them together
      hls1.attachMedia(video1);
      hls1.on(Hls.Events.MEDIA_ATTACHED, function () {
        console.log("video and hls.js are now bound together!");
        hls1.loadSource("http://ip_address/live/stream1/mystream.m3u8");
        hls1.on(Hls.Events.MANIFEST_PARSED, function (event, data) {
          console.log("manifest loaded, found " + data.levels.length + " quality levels");
        });
      });
    }
  </script>
  <video id="video2" autoplay muted controls></video>
  <script>
    if (Hls.isSupported()) {
      var video2 = document.getElementById('video2');
      var hls2 = new Hls();
      // bind them together
      hls2.attachMedia(video2);
      hls2.on(Hls.Events.MEDIA_ATTACHED, function () {
        console.log("video and hls.js are now bound together!");
        hls2.loadSource("http://ip_address/live/stream2/mystream.m3u8");
        hls2.on(Hls.Events.MANIFEST_PARSED, function (event, data) {
          console.log("manifest loaded, found " + data.levels.length + " quality levels");
        });
      });
    }
  </script>
  <video id="video3" autoplay muted controls></video>
  <script>
    if (Hls.isSupported()) {
      var video3 = document.getElementById('video3');
      var hls3 = new Hls();
      // bind them together
      hls3.attachMedia(video3);
      hls3.on(Hls.Events.MEDIA_ATTACHED, function () {
        console.log("video and hls.js are now bound together!");
        hls3.loadSource("http://ip_address/live/stream3/mystream.m3u8");
        hls3.on(Hls.Events.MANIFEST_PARSED, function (event, data) {
          console.log("manifest loaded, found " + data.levels.length + " quality levels");
        });
      });
    }
  </script>
  <video id="video4" autoplay muted controls></video>
  <script>
    if (Hls.isSupported()) {
      var video4 = document.getElementById('video4');
      var hls4 = new Hls();
      // bind them together
      hls4.attachMedia(video4);
      hls4.on(Hls.Events.MEDIA_ATTACHED, function () {
        console.log("video and hls.js are now bound together!");
        hls4.loadSource("http://ip_address/live/stream4/mystream.m3u8");
        hls4.on(Hls.Events.MANIFEST_PARSED, function (event, data) {
          console.log("manifest loaded, found " + data.levels.length + " quality levels");
        });
      });
    }
  </script>
</body>
</html>
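By default the four players stack vertically. To get the 2×2 matrix from the video title, a small style block can be added inside <head> — this is a sketch of one way to do it, not something shown in the video:

```html
<!-- Optional: arrange the four players in a 2x2 grid.
     Hidden <script> elements do not create grid cells, so only
     the four <video> elements occupy the grid. -->
<style>
  body {
    display: grid;
    grid-template-columns: 1fr 1fr;
    gap: 4px;
    margin: 0;
  }
  video { width: 100%; }
</style>
```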
Create Live Stream Directory
sudo mkdir -p /var/www/html/live
Add to /etc/fstab to Create a 100MB RAM Disk
tmpfs /var/www/html/live tmpfs defaults,noatime,nosuid,mode=0755,size=100m 0 0
HLS segments are rewritten constantly, so keeping them in RAM avoids wearing out the SD card. Mount the RAM disk without rebooting:
sudo mount /var/www/html/live
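Whether 100MB is enough depends on your cameras' bitrate. A back-of-the-envelope check, assuming a ~2 Mbit/s substream per camera (an assumption — check your camera's substream settings): each playlist keeps hls_list_size (10) segments of hls_time (10) seconds.

```shell
# Rough RAM-disk sizing: bits per second * seconds retained * streams / 8 = bytes.
BITRATE_MBIT=2                 # assumed per-camera substream bitrate
SECONDS_KEPT=$((10 * 10))      # hls_list_size * hls_time
STREAMS=4
MB=$((BITRATE_MBIT * SECONDS_KEPT * STREAMS / 8))
echo "${MB} MB"
```

At these assumed rates that lands right at 100 MB, with no margin for segments awaiting deletion — lower the substream bitrate or hls_list_size, or raise size= in fstab, if the disk fills.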
Create Stream File at /var/www/html/stream1.sh for Each Stream
This is configured for an Amcrest IP camera, but it should work with any RTSP source. Replace username, password, and ip_address with your camera's credentials and address.
#!/bin/bash
mkdir -p /var/www/html/live/stream1/
VIDSOURCE="rtsp://username:password@ip_address:554/cam/realmonitor?channel=1&subtype=1"
VIDEO_OPTS="-vcodec copy"
OUTPUT_HLS="-f hls -hls_time 10 -hls_list_size 10 -hls_flags delete_segments -start_number 1"
ffmpeg -nostdin -hide_banner -loglevel panic -i "$VIDSOURCE" -y $VIDEO_OPTS $OUTPUT_HLS /var/www/html/live/stream1/mystream.m3u8
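The four stream files differ only in their number. As a sketch (not from the video), a single parameterized helper can build the ffmpeg command for any stream; username, password, and ip_address remain placeholders:

```shell
#!/bin/bash
# Hypothetical helper: print the ffmpeg command for stream N, so one script
# can stand in for stream1.sh..stream4.sh. Substitute your camera's real
# credentials for the username/password/ip_address placeholders.
build_cmd() {
  local n="$1"
  local src="rtsp://username:password@ip_address:554/cam/realmonitor?channel=${n}&subtype=1"
  local out="/var/www/html/live/stream${n}/mystream.m3u8"
  echo "ffmpeg -nostdin -hide_banner -loglevel panic -i \"${src}\" -y -vcodec copy" \
       "-f hls -hls_time 10 -hls_list_size 10 -hls_flags delete_segments -start_number 1 ${out}"
}

build_cmd 2   # prints the invocation for camera 2
```

Create the matching output directory (mkdir -p) before running the printed command, exactly as the per-stream scripts above do.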
In the /var/www/html Directory, Make the Scripts Executable and Start Each Stream in the Background
sudo chmod +x stream1.sh stream2.sh stream3.sh stream4.sh
sudo ./stream1.sh & sudo ./stream2.sh & sudo ./stream3.sh & sudo ./stream4.sh &
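Backgrounded scripts die with the terminal and don't survive a reboot. One option (an assumption, not shown in the video) is a systemd template unit saved as /etc/systemd/system/camstream@.service:

```ini
# Hypothetical template unit: "systemctl enable --now camstream@1"
# (and @2..@4) starts each stream at boot and restarts it on failure.
[Unit]
Description=HLS camera stream %i
After=network-online.target nginx.service

[Service]
ExecStart=/var/www/html/stream%i.sh
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target
```

Run sudo systemctl daemon-reload once after creating the file, then enable each instance.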
Stop All Streams
sudo killall ffmpeg
