Nginx Live editing


We’ve been working on implementing a live editing option based on nginx-rtmp. So far we have managed to record the stream and save a delayed mp4 file to a specific path (/opt/kaltura/web/content/recorded).

We want to share the recorded stream as a VOD entry, but I have doubts about these points:

  • Can I define a static path for the source file of this entry? For instance: /opt/kaltura/web/recorded/live.mp4
  • The file grows in size every minute, obviously. Can I open the VOD entry and edit it in order to create additional clips, or do I need to call the API to update the metadata with its actual size?

If we manage to get this setup working, we’ll be glad to share our setup with everyone, of course.

Thanks for any help,

David Eusse


Hi @david.eusse,

Please see:

Hello Jess,
I went through that post already, but I understand it is more oriented to events that are broadcast and become VOD entries afterwards.

What I’m trying to do is to have a real “Live Editor” where a content uploader can clip a Live Stream and produce a VOD entry on the fly.

I have already managed to produce a correct mp4 entry that is being updated every minute and placed on a certain path (like a dropfolder). I do that with an additional exec in the nginx segmenting part.

I would like to assign a VOD entry directly to its path, and that’s why I have these two questions:

  • How do I assign the real mp4 path to a VOD entry?
  • Do I need to update the entry metadata, knowing that it grows in size every minute?

In the end, I just want the content uploader to open the VOD entry and use the standard editing tools. Do you think it would work?

Since producing the right mp4 took me quite a lot of work, I’ll be glad to share my setup here.

Thanks for all your help,


Hi David,

If I understand correctly, you’d like to create a Kaltura VOD entry out of the recorded live stream.

This is already covered in the existing solution I referred you to in my last response, see:

The only difference is [unless I have misunderstood you] that you’d like to create that said entry WHILE the stream is still active, as opposed to AFTER the session has ended.
For that, I suggest you create a media entry with a conversion profile that includes only the source [KalturaMediaEntry has a member called conversionProfileId which you can set when calling media->add()], so that the entry’s status will be set to READY as soon as the source ingestion phase is completed, without any transcoding taking place.
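For anyone following along, a minimal sketch of that media->add() call through Kaltura’s API v3 HTTP interface might look like the following. The host, session string (KS), entry name and conversion profile id are all placeholders, and the colon-notation parameters are an assumption based on the usual API v3 conventions, so treat this as a starting point rather than a tested recipe:

```shell
#!/bin/bash
# Hypothetical sketch: create a media entry tied to a source-only
# conversion profile via Kaltura's API v3 HTTP interface.
# The host, KS and profile id are placeholders you must replace.
add_live_vod_entry() {
  local host="$1" ks="$2" profile_id="$3"
  curl -s "$host/api_v3/service/media/action/add" \
    -d "ks=$ks" \
    -d "entry:objectType=KalturaMediaEntry" \
    -d "entry:mediaType=1" \
    -d "entry:name=Live recording" \
    -d "entry:conversionProfileId=$profile_id"
}

# Example call (all placeholders):
# add_live_vod_entry "https://kaltura.example.com" "$KS" 1234
```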

Hello Jess,

You are right about the difference. I need to edit the entry WHILE the stream is active in order to extract some segments (it’s a news channel). The stream is reset every morning at 5 AM.

I was about to do what you suggested, and I have already created a different conversion profile with no transcoding (just the source).
I’ll go through the information you sent and let you know.



Hello Jess,

I have just one more question:
Will it be fine to have an mp4 source entry that changes in size every minute?
Honestly, I haven’t checked whether the file size is stored somewhere with the metadata, whether it’s needed in advance, or whether it’s calculated by Kaltura or the KMC when the file is accessed.

The entry source is a proper mp4 produced with an ffmpeg concat command.

Thanks again for your help,


Hi @david.eusse,

First, we must distinguish between size and duration.
A KalturaMediaEntry has a member called msDuration which it inherits from the KalturaPlayableEntry class it extends.
It does not have a size member because it does not have actual media files associated with it. Instead, it has KalturaFlavorAsset records [or objects, depending on your perspective] attached to it. KalturaFlavorAsset has a size member which it inherits from the KalturaAsset class it extends.

Regardless, I’m not sure how you could end up with the wrong size or duration in this scenario. After creating the initial VOD entry with media->add(), if you wish to update the source flavour, you will have to use media->updateContent(), and that will take care of updating both members.
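As a sketch of what that periodic update could look like from the recording box, assuming the mp4 lives on the same server as Kaltura (so a KalturaServerFileResource with a localFilePath can be used), it might be something like the following. The host, KS, entry id and path are placeholders, and the parameter notation is an assumption based on the API v3 conventions:

```shell
#!/bin/bash
# Hypothetical sketch: point an existing entry's source at a local mp4
# via media->updateContent(), using a server-side file resource.
# Host, KS, entry id and path are placeholders/assumptions.
update_entry_source() {
  local host="$1" ks="$2" entry_id="$3" mp4_path="$4"
  curl -s "$host/api_v3/service/media/action/updateContent" \
    -d "ks=$ks" \
    -d "entryId=$entry_id" \
    -d "resource:objectType=KalturaServerFileResource" \
    -d "resource:localFilePath=$mp4_path"
}

# Example call (all placeholders):
# update_entry_source "https://kaltura.example.com" "$KS" "0_abc123" \
#   /opt/kaltura/web/content/recorded/news.mp4
```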

Thank you,

You are right. What is relevant is Duration and not Size.

My concern is about the VOD clipping app and how it will behave if the file it is editing changes in size every minute. Maybe I’m going down the wrong path, but I don’t see an easier way to implement a “live editing” solution.

We’ll continue testing our setup and we’ll share any results here.



Hi @david.eusse,

Like I said, whenever you override the existing “in progress” VOD entry, you should do that by calling media->updateContent(). This will take care of updating the msDuration member so there should be no problem in that regard.

Thank you Jess,

I understand what I have to do and I’ll inform you of my results. Once I get it working, I’ll share the information on this forum.



Cheers, David.
I look forward to seeing it :)

Me too :)

I will start by explaining what we are doing right now.

We are broadcasting 4 live channels, and one of them is a news network. The video uploaders are used to clipping the noon news and uploading the clips immediately, without waiting for the internal workflow to provide them.

We tried Wowza, but apart from this feature, with nginx-rtmp + ffmpeg we get a lot more flexibility (and reliability).
So we started by testing Jess’s approach to live recordings and found a fundamental difference: it is designed for live events, not for a continuous stream.

So we proceeded this way:

  1. Instead of using the nginx-rtmp native live flv recording, we decided to record directly an mp4 source and make it available as a VOD entry.
  2. We added this exec line to the transcoding part of the live HLS setting (it can also be done with an ffmpeg tee):

exec /usr/bin/ffmpeg -i rtmp://localhost:1935/$app/$name -g 60 -vprofile baseline -pix_fmt yuv420p -c:v libx264 -c:a aac -b:v 1600k -b:a 96k -vf "scale=1280:720" -tune zerolatency -preset veryfast -flags +global_header -f segment -segment_time 60 -segment_format_options movflags=+faststart -reset_timestamps 1 /opt/kaltura/web/content/recordedtmp/$name%d.mp4;

If any part of this line is wrong, please let me know. We cheated, of course!

With this, we record and segment the live stream into 1-minute mp4 segments, knowing that we can’t just edit or play an mp4 file that doesn’t have the correct metadata (moov atom).

  3. We set up a cron script that concats the last live segment onto a temp mp4 and rewrites the entry source every minute. The recording is delayed, but the actual file is a playable mp4.

This is what we have so far:


# FILE, DEST and TMPDIR are assumed to be set earlier in the script
TOTAL=$(ls -1 "$FILE"*mp4 | wc -l)
TOPE=$(expr $TOTAL - 2)

if [ "$TOTAL" -eq 2 ]; then
    cp "${FILE}0.mp4" "${DEST}${FILE}.mp4"
fi

if [ "$TOPE" -ge 1 ]; then
    rm -f file1.lst
    echo "file '${DEST}${FILE}.mp4'" > file1.lst
    echo "file '${TMPDIR}${FILE}${TOPE}.mp4'" >> file1.lst
    ffmpeg -f concat -safe 0 -i file1.lst -c copy "${DEST}temp.mp4"
    mv "${DEST}temp.mp4" "${DEST}${FILE}.mp4"
fi

This performs an old-fashioned file count, finds the oldest-1 newsx.mp4 segment, and appends it to the entry’s mp4 source file. We still need to perform some checks, like validating that the stream is available and/or skipping any discontinuities without overriding the current recording, but it works so far.
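One sanity check we could add before overwriting the entry source is verifying that the freshly concatenated file is actually a playable mp4, i.e. that ffprobe can read a nonzero duration from it (which implies the moov atom is in place). A sketch, assuming ffprobe is installed on the recording box:

```shell
#!/bin/bash
# Sketch: succeed only if the file has a readable, nonzero duration
# according to ffprobe (a missing/broken moov atom makes this fail).
check_playable() {
  local f="$1" dur
  dur=$(ffprobe -v error -show_entries format=duration \
        -of default=noprint_wrappers=1:nokey=1 "$f" 2>/dev/null)
  [ -n "$dur" ] && awk -v d="$dur" 'BEGIN { exit !(d > 0) }'
}

# Usage idea: only replace the entry source when the temp file checks out.
# check_playable "${DEST}temp.mp4" && mv "${DEST}temp.mp4" "${DEST}${FILE}.mp4"
```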

  4. What we’re still missing:
  • Provisioning (via the API) the VOD entry pointing to the news.mp4 source and making it available in the KMC.
  • Calling the API to update the mp4 duration so it is consistent with the added chunk.

We are working on that, and that’s what we want to share. I have to say that the nginx + ffmpeg setup is far more reliable than the Java solution. The tedious lip-sync problems that bothered us all the time have become very rare.

We are working on the last part and any ideas or comments will be appreciated.


Hello all,

I have finally managed to put together a solution where I can record a live stream, edit it with a certain delay, and upload new episodes on a controlled timetable.

I am using the current Flash clipper, but I’m running into a problem because it is not possible to fast-forward the clip before it has been buffered by the client.

Is there a way to solve or improve this situation? Will an HTML5 editor be available for Kaltura CE?

I’ll post our complete setup so anyone can try it, as we managed to make it work (sort of).