Contents
- Basic FFmpeg command notation
- Transcode a stream with FFmpeg
- Transcode an RTSP or RTP IP camera source
- Transcode an MPEG-TS stream
- Transcode a native RTP stream
- Stream over a different outgoing protocol
- Stream over RTMP
- Stream over RTSP/RTP
- Other FFmpeg command line examples
- List available devices
- Convert .ts source files to .mp4
- Synchronize audio and video that start at different timecodes
- Analyze an FFmpeg stream using logs and FFprobe
- Save a live stream as .mp4
- Get encoding properties
Basic FFmpeg command notation
A basic FFmpeg command uses the format
ffmpeg [input-options] -i [input-file] [output-options] [output-stream-URI]
Where:
[input-options] apply to the input, or source, file. For example, you can use -s to specify the frame size of the source.
[input-file] is the video file or the stream URL.
[output-options] apply to the output, or destination. For example, the -f option specifies the output container format.
[output-stream-URI] is the destination stream URI. The format of the URI depends on the output container format.
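As a sketch, the four parts of the notation map onto a real command like this (all values here are placeholders, not taken from the examples below):

```shell
# Purely illustrative: assemble the four parts of the notation into one
# command string. Every value is a placeholder.
INPUT_OPTS="-ss 10"          # [input-options]: seek 10 seconds into the source
INPUT="input.ts"             # [input-file]
OUTPUT_OPTS="-c copy -f mp4" # [output-options]: copy streams into an MP4 container
OUTPUT="output.mp4"          # [output-stream-URI]
CMD="ffmpeg $INPUT_OPTS -i $INPUT $OUTPUT_OPTS $OUTPUT"
echo "$CMD"
```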
Transcode a stream with FFmpeg
Here are a few examples that use FFmpeg to transcode an RTSP, RTP, or MPEG-TS source stream.
Transcode an RTSP or RTP IP camera source
ffmpeg -i "rtsp://[camera-ip-address]/[camera-URI-syntax]" -pix_fmt yuv420p -deinterlace -vf "scale=640:360" -vsync 1 -threads 0 -vcodec libx264 -r 29.970 -g 60 -sc_threshold 0 -b:v 1024k -bufsize 1216k -maxrate 1280k -preset medium -profile:v main -tune film -acodec aac -b:a 128k -ac 2 -ar 48000 -af "aresample=async=1:min_hard_comp=0.100000:first_pts=0" -bsf:v h264_mp4toannexb -f mpegts udp://127.0.0.1:10000
Transcode an MPEG-TS stream
ffmpeg -i "udp://localhost:[port]" -pix_fmt yuv420p -deinterlace -vf "scale=640:360" -vsync 1 -threads 0 -vcodec libx264 -r 29.970 -g 60 -sc_threshold 0 -b:v 1024k -bufsize 1216k -maxrate 1280k -preset medium -profile:v main -tune film -acodec aac -b:a 128k -ac 2 -ar 48000 -af "aresample=async=1:min_hard_comp=0.100000:first_pts=0" -bsf:v h264_mp4toannexb -f mpegts udp://127.0.0.1:10000
Transcode a native RTP stream
ffmpeg -i "unicast.sdp" -pix_fmt yuv420p -deinterlace -vf "scale=640:360" -vsync 1 -threads 0 -vcodec libx264 -r 29.970 -g 60 -sc_threshold 0 -b:v 1024k -bufsize 1216k -maxrate 1280k -preset medium -profile:v main -tune film -acodec aac -b:a 128k -ac 2 -ar 48000 -af "aresample=async=1:min_hard_comp=0.100000:first_pts=0" -bsf:v h264_mp4toannexb -f mpegts udp://127.0.0.1:10000
Stream over a different outgoing protocol
You can deliver FFmpeg streams using RTMP and RTSP/RTP, as shown in the following examples.
Stream over RTMP
To deliver a stream using RTMP, change the output portion of the FFmpeg command from -f mpegts udp://127.0.0.1:10000 to -f flv rtmp://127.0.0.1/live/myStream.
The RTMP URL must follow the format
rtmp://[wowza-ip-address]:1935/[application]/[stream-name]
Where:
[wowza-ip-address] is the IP address of your Wowza Streaming Engine server.
[application] is the name of your Wowza Streaming Engine application (such as live or vod).
[stream-name] is the name of the stream in the Wowza Streaming Engine application.
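The RTMP URL can be assembled from those parts, as in this sketch (the IP address, application, and stream name below are placeholder values):

```shell
# Placeholder values; substitute your server's address, application, and stream.
WOWZA_IP=127.0.0.1
APP=live
STREAM=myStream
RTMP_URL="rtmp://${WOWZA_IP}:1935/${APP}/${STREAM}"
echo "$RTMP_URL"
```

The resulting URL replaces the udp://127.0.0.1:10000 destination in the transcode commands above.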
Stream over RTSP/RTP
To deliver a stream using RTSP/RTP, change the output portion of the FFmpeg command from -f mpegts udp://127.0.0.1:10000 to -f rtsp rtsp://127.0.0.1:1935/live/myStream.sdp.
The RTSP URL must follow the format
rtsp://[wowza-ip-address]:1935/[application]/[stream-name]
Where:
[wowza-ip-address] is the IP address of your Wowza Streaming Engine server.
[application] is the name of your Wowza Streaming Engine application (such as live or vod).
[stream-name] is the name of the stream in the Wowza Streaming Engine application.
Other FFmpeg command line examples
List available devices
Get a list of available hardware devices, such as webcams and microphones (this example uses the DirectShow input, which is available on Windows):
ffmpeg -list_devices true -f dshow -i dummy
Convert .ts source files to .mp4
Convert a .ts file (or any FFmpeg-compatible source) to .mp4 for VOD playback with Wowza Streaming Engine:
ffmpeg -i input.ts -c:v copy -c:a copy output.mp4
Synchronize audio and video that start at different timecodes
Wowza Streaming Engine doesn’t use the edit lists that some video recorders write to a .mp4 or .mov container to synchronize audio and video that start at different timecodes. Instead, the following FFmpeg commands create separate video and audio renditions of the file in .mkv format, removing a specified number of seconds (in this case, 4) from the beginning of each. The video and audio files are then remuxed into an .mp4 file that is, in this example, 4 seconds shorter. Adjust the number of seconds removed as needed to synchronize your audio and video.
ffmpeg -y -ss 4 -i source.mp4 -c:v copy -an source_video.mkv
ffmpeg -y -ss 4 -i source.mp4 -vn -c:a copy source_audio.mkv
ffmpeg -y -i source_video.mkv -i source_audio.mkv -c:v copy -c:a copy source_trim.mp4
Analyze an FFmpeg stream using logs and FFprobe
You can study a Wowza Streaming Engine access log file to see how your stream is encoding. A bad keyframe interval can cause the length of chunks (segments) to vary, which can lead to client buffering and connection timeouts.
Here’s an example from a Wowza Streaming Engine access log file that shows a healthy Apple HLS stream. This particular stream has a frame rate of 29.97 fps with a 2-second GOP.
In the example, a = audio frames, v = video frames, and k = keyframes.
a/v/k:469/300/5 duration:10000
a/v/k:471/299/5 duration:10000
a/v/k:469/300/5 duration:10001
How can you tell the stream is healthy? First, the segments are within 1 ms of being the same length. (The first segment of a stream can be slightly shorter if the first timecode in the live stream isn’t zero, which would account for such a discrepancy. This is expected behavior, but in some edge cases it may cause an issue with players reconnecting when the stream first starts.)
In addition, the GOP (the number of video frames divided by the number of keyframes) is consistently about 60. Because the frame rate is noninteger (29.97 fps), there will be some variance in the number of frames per segment.
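These numbers can be sanity-checked with a quick calculation (a sketch, using the 29.97 fps frame rate and 2-second GOP from the example above): a 10,000 ms segment should carry roughly 29.97 × 10 ≈ 300 video frames and 10 ÷ 2 = 5 keyframes.

```shell
# Expected video frames and keyframes for a 10-second segment of 29.97 fps
# video with a 2-second GOP, matching the healthy log lines above.
frames=$(awk 'BEGIN { printf "%.0f", 29.970 * 10 }')
keyframes=$((10 / 2))
echo "frames=$frames keyframes=$keyframes"
```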
Here’s an example of a live stream whose playback might be suffering:
a/v/k:469/300/5 duration:10000
a/v/k:461/295/5 duration:9843
a/v/k:459/294/8 duration:9810
a/v/k:537/344/14 duration:11478
Not only do the segment lengths vary, from 9810 ms to 11478 ms, but the GOP also varies greatly, from 60 for the first segment to about 24.6 for the last. As a result, players may disconnect or buffer. And if the keyframes don’t align across renditions, players will be unable to switch to a different resolution.
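The GOP values quoted above can be recomputed directly from the log lines. This small helper (a sketch, not part of any Wowza tooling) splits an a/v/k line and divides video frames by keyframes:

```shell
# Compute GOP = video frames / keyframes from an access-log line of the form
# "a/v/k:<audio>/<video>/<keyframes> duration:<ms>".
gop_of() {
  echo "$1" | awk -F'[:/ ]' '{ printf "%.2f\n", $5 / $6 }'
}

gop_of "a/v/k:469/300/5 duration:10000"    # healthy segment: 60.00
gop_of "a/v/k:537/344/14 duration:11478"   # problem segment: 24.57
```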
In addition to using logs to assess an FFmpeg live encode, you can analyze it using FFprobe.
Save a live stream as .mp4
Save a minute or more of a live stream from Wowza Streaming Engine as a .mp4 file to run through FFprobe:
ffmpeg -i [live-url] -codec copy -f mpegts -y out.ts
ffmpeg -i [live-url] -codec copy -f mp4 -y out.mp4
Get encoding properties
ffprobe -show_streams [stream-name]
Source: https://www.wowza.com/docs/how-to-live-stream-using-ffmpeg-with-wowza-streaming-engine