How do I turn a stereo camera connected to a Jetson Nano into an IP camera?

There is a ZED Mini stereo camera, which is in effect an IP camera by itself (the frame from the left sensor is placed next to the frame from the right sensor, producing a single frame of double width). It is connected to a Jetson Nano, and the Nano uses the ZED SDK library to read it frame by frame, along with per-pixel distance information, and then modifies each frame with a shader.
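For context, here is a hedged sketch of what that capture loop might look like with the ZED SDK's Python bindings (pyzed). The class and enum names follow ZED SDK 3.x and are assumptions to check against the installed version; the shader step is left as a placeholder.

```python
# Sketch of the ZED capture loop (pyzed names assumed from ZED SDK 3.x --
# verify against the SDK version actually installed on the Jetson).

def side_by_side_width(per_eye_width: int) -> int:
    """The ZED delivers both sensors in one frame, so the combined
    side-by-side frame is twice the width of a single eye's image."""
    return 2 * per_eye_width

def main():
    import pyzed.sl as sl  # ZED SDK Python bindings (requires the SDK)

    zed = sl.Camera()
    init = sl.InitParameters()
    init.camera_resolution = sl.RESOLUTION.HD720
    init.depth_mode = sl.DEPTH_MODE.PERFORMANCE
    if zed.open(init) != sl.ERROR_CODE.SUCCESS:
        raise RuntimeError("cannot open ZED camera")

    image = sl.Mat()  # will hold the double-width side-by-side frame
    depth = sl.Mat()  # per-pixel distance map
    while zed.grab() == sl.ERROR_CODE.SUCCESS:
        zed.retrieve_image(image, sl.VIEW.SIDE_BY_SIDE)  # left+right in one frame
        zed.retrieve_measure(depth, sl.MEASURE.DEPTH)    # distances per pixel
        # ... apply the shader-style modification here, then hand the
        # frame to the encoder / streaming pipeline ...

# On the Jetson, with a ZED Mini attached: main()
```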
Now the output of the Jetson Nano needs to be a video stream such that the ZED Mini + Jetson Nano system is seen from the outside as an IP camera with a URL, from which one can read the (shader-modified) video that the ZED Mini captures.
Note that the Jetson Nano has a capable NVIDIA GPU that can hardware-encode the video stream to H.264 (H.265 is also possible, but the external device that will receive the feed runs Android, so only H.264 can be used).
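As a quick sanity check of the hardware encoder, one possible approach is a GStreamer test pipeline. The element names below (`nvv4l2h264enc`, `nvvidconv`) are assumptions based on recent JetPack images; older releases expose the encoder as `omxh264enc` instead, so verify with `gst-inspect-1.0` first.

```python
# Builds a gst-launch-1.0 pipeline that encodes a test pattern with the
# Jetson's hardware H.264 encoder and writes it to a file. Element names
# are assumptions (JetPack 4.x+); check them with `gst-inspect-1.0`.

def encode_test_pipeline(width: int = 1280, height: int = 720,
                         fps: int = 30, bitrate_bps: int = 4_000_000) -> str:
    """Pipeline string for a hardware-encode smoke test."""
    return (
        f"videotestsrc num-buffers=300 ! "
        f"video/x-raw,width={width},height={height},framerate={fps}/1 ! "
        f"nvvidconv ! "                       # copies frames into NVMM memory
        f"nvv4l2h264enc bitrate={bitrate_bps} ! "  # hardware H.264 encoder
        f"h264parse ! qtmux ! filesink location=test.mp4"
    )

# Run the printed command in a terminal on the Jetson to test the encoder.
print("gst-launch-1.0 " + encode_test_pipeline())
```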
Question: where can I find a tutorial for writing a program on Ubuntu that creates an IP camera (in C++, or at least in Python)?
I don't understand this area and have been digging around the Internet for a long time looking for descriptions or tutorials on home-made IP cameras... and have found essentially nothing. The information is either aimed at professionals or is about a different subject (for example, the internals of the H.264 algorithms/standard). I have example programs written with the ZED SDK (https://github.com/stereolabs/zed-examples/tree/ma... https://github.com/stereolabs/zed-examples/tree/ma...) but I don't understand how they work. I have also spent a long time trying to understand shaders, likewise without success. Quantum mechanics came easier to me back in the day :( I have solid programming experience in 1C:Enterprise 8.2, but this is a different world.
I always thought information was not hard to find with Google's help. But on this topic... there is some kind of conspiracy of silence. In fact, some experts are ready to help me, but I need to prepare myself so I can properly understand their advice. That is why I'm looking for a tutorial.
The Jetson Nano is in my hands, and it successfully displays a Raspberry Pi camera on a monitor. But for the stereo camera and for using shaders, much more knowledge is needed.
Thanks in advance to everyone who responds! If the startup takes off (and once the device exists it certainly will, since the market has already been tested with a prototype version of the hardware), everyone who helped is welcome to come to me for help (you just have to refer to this discussion).
PS: I forgot to mention. The "root cause" of my interest is a lack of money, so a lot has to be done on my own.
April 3rd 20 at 18:43
2 answers
April 3rd 20 at 18:45
Solution
Off the top of my head: you need to assemble a video stream out of your stream of frames. If so, look at ffmpeg; it seems to be available for the Jetson Nano. As output you can emit an RTSP stream.
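For example, one hedged way to apply this idea: pipe the modified raw frames into an ffmpeg process over stdin and have ffmpeg push them out as RTSP. This sketch assumes a separate RTSP server (e.g. mediamtx) is already listening on the URL, and it uses the software encoder `libx264`; getting ffmpeg to use the Jetson's hardware encoder may require a patched build, so GStreamer is often the easier route there.

```python
# Sketch: feed raw BGR frames to ffmpeg via stdin, output an RTSP push
# stream. Assumes an external RTSP server is listening at the given URL.

import subprocess

def ffmpeg_rtsp_cmd(width: int, height: int, fps: int,
                    url: str = "rtsp://127.0.0.1:8554/zed") -> list:
    """Command line for piping raw frames into ffmpeg as an RTSP stream."""
    return [
        "ffmpeg",
        "-f", "rawvideo", "-pix_fmt", "bgr24",   # raw frames arrive on stdin
        "-s", f"{width}x{height}", "-r", str(fps),
        "-i", "-",
        "-c:v", "libx264",                       # software encoder; HW needs a patched build
        "-preset", "ultrafast", "-tune", "zerolatency",
        "-f", "rtsp", url,                       # push to the RTSP server
    ]

def main():
    # Double width because the ZED frame is left+right side by side.
    cmd = ffmpeg_rtsp_cmd(2 * 1280, 720, 30)
    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
    # For each shader-modified frame: proc.stdin.write(frame_bytes)

# On the Jetson, with ffmpeg and an RTSP server running: main()
```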
April 3rd 20 at 18:47
Yes, exactly: "assemble a video stream out of the stream of frames". I had just started looking at ffmpeg (it should be available on the Jetson Nano). But your clarification about outputting an "RTSP stream" is very important; it may well solve the problem. Thank you so much! Off to study it...
However, on the page https://github.com/131/h264-live-player I found the statement that "There is no solution for 'real time' mp4 video creation / playback (ffmpeg, mp4box.js, mp4parser - boxing takes time)". I read "boxing takes time" as "packaging (i.e., muxing/encoding) takes time", in other words, it is simply slow. For my project a delay above 0.3, at most 0.5 seconds is undesirable. To avoid that lag, the author of that page offers a solution for the "Raspberry Pi + Raspberry Pi camera" combination. On the other hand, the Jetson Nano (unlike the Raspberry Pi) supports hardware encoding, which ffmpeg should (in theory) be able to use, so there should be no such slowdown.
And thirdly, the author of that page has a solution for the Raspberry Pi camera, which I will also try to understand (though it's a long shot).
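For the record, a low-latency alternative that makes the Jetson itself answer on an RTSP URL is GStreamer's gst-rtsp-server, whose Python bindings are normally present on JetPack. The sketch below serves a test pattern; in the real system an `appsrc` fed with the shader-modified ZED frames would replace `videotestsrc`. The Jetson element names (`nvvidconv`, `nvv4l2h264enc`) are assumptions to verify with `gst-inspect-1.0`.

```python
# Minimal RTSP-server sketch using gst-rtsp-server. The launch pipeline
# ends in an rtph264pay payloader named pay0, as the library requires.
# Jetson-specific element names are assumptions for recent JetPack.

def rtsp_launch_pipeline(width: int = 2560, height: int = 720,
                         fps: int = 30) -> str:
    """gst-rtsp-server 'launch' description for a hardware-encoded stream."""
    return (
        f"( videotestsrc is-live=true ! "
        f"video/x-raw,width={width},height={height},framerate={fps}/1 ! "
        f"nvvidconv ! nvv4l2h264enc insert-sps-pps=true ! "
        f"h264parse ! rtph264pay name=pay0 pt=96 )"
    )

def main():
    import gi
    gi.require_version("Gst", "1.0")
    gi.require_version("GstRtspServer", "1.0")
    from gi.repository import Gst, GstRtspServer, GLib

    Gst.init(None)
    server = GstRtspServer.RTSPServer()      # listens on port 8554 by default
    factory = GstRtspServer.RTSPMediaFactory()
    factory.set_launch(rtsp_launch_pipeline())
    factory.set_shared(True)                 # all clients share one pipeline
    server.get_mount_points().add_factory("/zed", factory)
    server.attach(None)
    # Clients can now open rtsp://<jetson-ip>:8554/zed
    GLib.MainLoop().run()

# On the Jetson, with gst-rtsp-server bindings installed: main()
```

Because the encoding happens in hardware and RTSP carries the H.264 stream directly (no MP4 "boxing" step), this path should stay well under the 0.3-0.5 s latency budget, though that needs measuring on the real device.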

Find more questions by tags: NVIDIA, Video surveillance, Ubuntu