added a story about gstreamer-webrtc
parent a86f21da4b
commit 6fb7b744e3
13 changed files with 342 additions and 27 deletions
138
content/posts/webrtc-and-gstreamer.md
Normal file
@ -0,0 +1,138 @@
---
title: "My Trials with WebRTC and Gstreamer"
draft: false
---

# But why?

The most important part of any software is the purpose - or at least a goal. If you don't have a goal, just write a

```rust
fn main() {
    while true {}
}
```

and call it a day. You made the computer do something!

So what is the goal here? I have a Rust program that uses [Tauri](https://tauri.app) as the front end,
and I have a Raspberry Pi with a Hailo8 accelerator attached to it that I need the camera input from.

The Tauri/controller needs to display the video feed from the Raspberry Pi so that the end-user can
see what the ML model is seeing.

## But why WebRTC?

Well, I'm really just using a web browser for my frontend, so I need a web-ready video streaming technology
that actually does smart scaling and all the hard stuff I don't want to deal with.

And I thought it would be easier than rolling my own. ~~and I'm unsure about that now, but sunk cost fallacy + learning something
new is pretty compelling~~

## What is WebRTC?

A browser-standardized and implemented data communication layer primarily used for peer-to-peer (or p2p) video and audio connections.

What does this mean? You pass some information to the browser with a JavaScript API, and your video element magically starts receiving
video and audio! It compensates for network conditions in a way that favors real-time delivery over consistent quality. Sounds great!

## An early warning

I thought it would be as easy as "there's my destination browser, try to start a connection", and I was wrong.

WebRTC is an incredibly flexible system. Here's a quote from the [Mozilla Documentation](https://developer.mozilla.org/en-US/docs/Web/API/WebRTC_API/Session_lifetime#information_exchanged_during_signaling)
(right above where this link takes you):

> It's also worth noting that the channel for performing signaling doesn't even need to be over the network. One peer can output a data object that can be printed out, physically carried (on foot or by carrier pigeon) to another device, entered into that device, and a response then output by that device to be returned on foot, and so forth, until the WebRTC peer connection is open. It'd be very high latency but it could be done.

And when you combine this with Gstreamer, too many hours were lost in the making of this blog post.

# Signaling

I'm sure most readers know that the internet is a very large, untamed landscape of legacy systems and petabytes of information transfer.
I'm sure most readers are also familiar with certain issues caused by trying to get computer A to talk to computer B.

And WebRTC basically said "that's a can of worms we aren't going to try to standardize", and gave us all the interfaces to implement that
part ourselves.

What this means is that you need a 'signaling' server to be able to connect two WebRTC endpoints. This server handles most setup communication
for the WebRTC clients (because this is p2p, there is no computer 'in charge' of the WebRTC connection) until the clients have established
a connection.

What I'm about to detail is just a summary of the [Mozilla documentation](https://developer.mozilla.org/en-US/docs/Web/API/WebRTC_API/Connectivity), which is
definitely worth a read if you want to do this yourself.

But long story short, one client gets prepped for starting a connection by:

1. enumerating the data streams it wants to transmit
2. creating a description of itself
3. handing the message to the server and saying "send this to the other person please"

"But wait!" you might say. "This is *what* the internet does!" And you would be correct. Except, in this case, setting up this communication-enabling server is an
exercise for the reader.

The server then sends the message to the other client, client B, who takes it, reads it, and:

1. enumerates its own data streams
2. creates a description of itself
3. sends a "yes, I would like to start a WebRTC connection" back to the server

and this continues as the two clients nail down specifics like "What's your IP address?" and "What media formats can you provide? I'll let you know which of those I want"
and the networking classic "Well shoot. You're behind a NAT. Let's figure this out".

So as you might guess, this is where most of my time is going to get spent!
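
To make that exchange a little more concrete, here's a rough sketch of the kind of messages a signaling server ends up relaying back and forth. The names and shapes are made up for illustration (WebRTC deliberately doesn't standardize any of this), and it assumes `serde` and `serde_json` as dependencies:

```rust
// Hypothetical signaling messages - the format is entirely up to you.
// A signaling server only needs to forward these blobs between peer A and peer B.
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Debug)]
#[serde(tag = "type", rename_all = "snake_case")]
enum SignalingMessage {
    /// Client A's "description of itself" (an SDP offer).
    Offer { sdp: String },
    /// Client B's "yes, let's connect" reply (an SDP answer).
    Answer { sdp: String },
    /// "Here's an address you might be able to reach me at" (an ICE candidate).
    IceCandidate { candidate: String, sdp_mline_index: u32 },
    /// "I'm done / something went wrong."
    EndSession,
}

fn main() -> Result<(), serde_json::Error> {
    // What client A would hand to the signaling server to pass along.
    let offer = SignalingMessage::Offer { sdp: "v=0 ...".into() };
    println!("{}", serde_json::to_string(&offer)?);
    Ok(())
}
```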

# Gstreamer's Gst-WebRTCSink

For those unfamiliar with Gstreamer, all you really need to understand is that it's basically a wrapper for connecting all of the Gstreamer elements.
You take a bunch of Gstreamer elements, and tell Gstreamer to connect them together into a "data pipeline" that happens to carry audio and visual data.
(this ignores the pipeline management, clock timing, and event buses it actually handles)

There are components for taking audio and video from webcams. There are components for changing the framerate and resolution. You can apply audio and visual effects in real time!
But most importantly (here), there is a plugin for "plug and play" WebRTC serving. It's part of the repository of Rust gstreamer elements over at [gst-plugins-rs](https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs).
Specifically, the `net/webrtc` folder.

And it gives a simple usage example that requires `cargo`, `npm`, `npm install webpack`, and obviously the Gstreamer `gst-launch-1.0` utility,
and three terminal windows. This was really my first "wait, I thought this would be easy" moment.
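
For a sense of what the producer side boils down to, here's a minimal sketch using the Rust bindings instead of `gst-launch-1.0`. This isn't my actual code - it assumes the `webrtcsink` element from gst-plugins-rs is installed, and it just pushes a test pattern at the element's default signalling setup (which still expects a signalling server to be reachable):

```rust
// Minimal sketch: feed a test pattern into webrtcsink and wait for errors/EOS.
// In older gstreamer-rs releases the parse module was the free function
// gst::parse_launch instead of gst::parse::launch.
use gstreamer as gst;
use gst::prelude::*;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    gst::init()?;

    // webrtcsink handles encoding, congestion control, and the WebRTC
    // transport itself; it only needs raw video on its sink pad.
    let pipeline = gst::parse::launch("videotestsrc ! videoconvert ! webrtcsink")?;

    pipeline
        .set_state(gst::State::Playing)
        .expect("failed to set the pipeline to Playing");

    // Run until an error or end-of-stream shows up on the bus.
    let bus = pipeline.bus().expect("pipeline should have a bus");
    for msg in bus.iter_timed(gst::ClockTime::NONE) {
        match msg.view() {
            gst::MessageView::Error(err) => {
                eprintln!("pipeline error: {}", err.error());
                break;
            }
            gst::MessageView::Eos(..) => break,
            _ => {}
        }
    }

    pipeline
        .set_state(gst::State::Null)
        .expect("failed to shut the pipeline down");
    Ok(())
}
```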

Some of you probably want me to add comments to this post just so you can say "skill issue". That may be true! I've never done
internet protocol implementation, so most of this seems rather complicated when I just wanted a Raspberry Pi to stream video to a computer
screen. But it's also a great learning experience! And I will take it as such - by finding whatever how-to guides will get me mostly working.
I learn best with something that I can iterate on and learn the fundamentals of over time instead of front-loading the spec into my brain.

The Gstreamer `webrtcsink` repo-page specifically has its own signaling server available, with examples and everything! Except! Well... You need to know
the IP of said signaling server on the startup of the gstreamer pipeline, which doesn't allow for any of the "you handle message passing" that WebRTC
was designed to provide... For this, you need to create a `Signaller` object that implements an interface as defined in [the example](https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/blob/main/net/webrtc/examples/webrtcsink-custom-signaller/signaller/imp.rs).
So I'm going to start there. Can't be too hard, right? ... right?

The idea is that by implementing it yourself, you can integrate it with existing message-passing software, like that websocket I know you are already using (don't worry,
I'm using one too).

There's just one hiccup. That link that I referenced with an example? It uses relative links to the rest of the workspace it is in, and Google, DuckDuckGo, GitHub, and Sourcegraph all could find
no implementation of this example in the wild (at least with most of the keywords I was using. I did eventually find [this GitHub repo](https://github.com/Eyevinn/srt-whep) that uses it),
and I spent at least 3 hours just to realize I couldn't use a direct GitHub Cargo import, and instead needed to use the `gst-plugin-webrtc` crate that Google couldn't find.

Because the docs.rs page is broken and doesn't build...

#### Gstreamer crate tangent

As a quick aside, I would like to mention that the gstreamer crate system is both kinda neat and mostly a pain to work with.

There is a module for just about everything that could be modularized. Just for my "take video input, output it over WebRTC", I need these four crates.

```toml
gst-plugin-webrtc = "0.13.0"
gstreamer = { version = "0.23.0", features = ["v1_22"] }
gstreamer-sdp = { version = "0.23.0", features = ["v1_22"] }
gstreamer-webrtc = { version = "0.23.0", features = ["v1_22"] }
```

That's not what my issue is though. My issue is that it took me 2-3 hours to discover most of these crates existed!

Because there is no centralized list of "these are the plugins, their crates, their included features" and such, when Google fails, it becomes
almost impossible to unearth them.

I'm not saying this as a critique of the Gstreamer ecosystem, but as someone with trauma.

# Now for the meaty part

So I've finally gotten a compiling project [(see here)](https://github.com/Nickiel12/gst-webrtc-example-signaller), now to actually implement the signalling server, right!

Well... Uhm, so. Here's the thing. I just need to be able to set up a Tauri `<video>` tag to point to a video source, right? Wouldn't it be nice if there was a nice and easy standard for this so I didn't
have to implement my own WebRTC signalling server? Enter WHIP.

# The rug pull

So, yeah. I'm just going to use [MediaMTX](https://github.com/bluenviron/mediamtx) until that fails, with a gstreamer RTSP source.

Bye!