
To read the frames of the video myHolidays.mp4, we ask FFMPEG to open the file and to send a raw, decoded copy of every frame to Python through a pipe. (You can use jrottenberg/ffmpeg or jrottenberg/ffmpeg:3.3 to get the latest FFMPEG build based on Ubuntu; this Docker image can also be used as a base for an encoding farm.)
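Below is a minimal sketch of such a call, assembled from the options discussed in the rest of this section. The FFMPEG_BIN name and the bufsize value of 10**8 are illustrative assumptions; adjust them to your setup.

```python
import subprocess as sp

FFMPEG_BIN = "ffmpeg"  # assumed: path to the ffmpeg executable ("ffmpeg.exe" on Windows)

command = [FFMPEG_BIN,
           '-i', 'myHolidays.mp4',   # input file
           '-f', 'image2pipe',       # send the output to a pipe instead of a file
           '-pix_fmt', 'rgb24',      # raw RGB, 3 bytes per pixel
           '-vcodec', 'rawvideo',    # raw frames, no re-encoding
           '-']                      # '-' means "write to stdout"

# bufsize is an arbitrary value larger than the size of one frame, as required below.
pipe = sp.Popen(command, stdout=sp.PIPE, bufsize=10**8)
```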


In the code above, -i myHolidays.mp4 indicates the input file, while rawvideo/rgb24 asks for a raw RGB output. The format image2pipe and the - at the end tell FFMPEG that it is being used with a pipe by another program. In sp.Popen, the bufsize parameter must be bigger than the size of one frame (see below); it can be omitted most of the time in Python 2, but not in Python 3, where its default value is quite small.

If the video has a size of 420x360 pixels, then the first 420x360x3 bytes output by FFMPEG will give the RGB values of the pixels of the first frame, line by line, top to bottom. The next 420x360x3 bytes after that will represent the second frame, and so on. Now we just have to read the output of FFMPEG.
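For instance, here is one way to read a single 420x360 frame from the pipe created above and turn it into an array; using NumPy for this step is an assumption, not something FFMPEG requires.

```python
import numpy as np

# Read the raw bytes of exactly one 420x360 RGB frame from FFMPEG's stdout.
raw_frame = pipe.stdout.read(420 * 360 * 3)

# Interpret the bytes as a (height, width, 3) array of uint8 RGB values.
frame = np.frombuffer(raw_frame, dtype='uint8').reshape((360, 420, 3))

# Reading the next 420*360*3 bytes the same way yields the second frame, and so on.
```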
