├── LICENSE.txt
└── README.md
/LICENSE.txt:
--------------------------------------------------------------------------------
1 | Copyright @readwithai, 2025
2 |
3 | By default copyright law gives you no right to copy or share material apart from through fair use.
4 | I as the author of this piece am giving you (fairly broad) permission to copy and modify this material through this license but want to maintain attribution to me and some form of "advertisement" of other parts of my work... which I want because I want to influence the world and need attention and money to do so! This in general is necessary to incentivise and allow me to do the work I want to do.
5 |
6 | You can copy and distribute this document without modification as much as you want.
7 |
8 | If you modify this document and distribute it you must:
9 | i. Indicate that it has been modified at the beginning of the document
10 | ii. Include a link to the original document by @readwithai at the beginning of the document
11 | iii. Maintain the header at the beginning of the document with its original links to social media.
12 | iv. Maintain the footer about @readwithai at the bottom together with its links.
13 |
14 | You may modify the header and footer to indicate that the document was originally by readwithai and has been modified.
15 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # A beginner's Cookbook for FFmpeg
2 | **@readwithai** - [**X**](https://x.com/readwithai) - [**blog**](https://readwithai.substack.com/) - [**machine aided reading**](https://www.reddit.com/r/machineAidedReading/comments/1jv7zlw/welcome_to_machine_assisted_reading/)
3 |
4 | This is a beginner's cookbook for the audio-video command-line tool, [FFmpeg](https://www.ffmpeg.org/). It builds up the reader's knowledge of FFmpeg's features through examples, before providing more applied examples which link back to early examples to aid understanding and adaptation. You may now want to jump to the [introduction](#introduction).
5 |
6 | I created this guide because I found working out how FFmpeg worked from snippets too difficult, and needed to form an understanding from first principles. FFmpeg already has complete reference documentation, but this can be difficult to understand. Adding concrete examples of features makes those features easier to understand; giving the reader something they can run gives them something working that they can adapt; linking to simpler examples lets the reader learn on the simple example and then apply the more complicated features; and documentation that links to examples gives the reader an opportunity to understand a feature through documentation and then continue.
7 |
8 | This guide provides an [index of features](#features) which goes through different FFmpeg features linked to recipes. FFmpeg has more features than a cookbook can cover completely, but it provides the ability to [list filters](#list) and [display their parameters](#parameters), as well as [terse but useable reference documentation](https://ffmpeg.org/ffmpeg-filters.html). I try to label links with `(example)` if they link to an earlier example of a feature or `(doc)` if they link to external documentation.
9 |
10 | > *If all this seems interesting and you'd like to ask me some questions, I'm open to doing some [**FFmpeg tutoring**](https://readwithai.substack.com/p/online-ffmpeg-tutoring) on the topic. I could walk through these commands, or apply this approach to your problems.*
11 |
12 | # Attribution
13 | Thanks to [Ventz Petkov](https://github.com/ventz) for providing me with [various additional recipes](https://github.com/talwrii/ffmpeg-cookbook/issues/1). This is particularly valuable as I am not (at the time of writing) an expert in FFmpeg.
14 |
15 | If you think a particular recipe belongs in this guide, feel free to [add it as an issue](https://github.com/talwrii/ffmpeg-cookbook/issues). It may take me a little time to add recipes as I want to keep the guide accessible to beginners, so bear this in mind when considering whether submitting recipes is worth your time!
16 |
17 |
18 | # Alternatives and prior work
19 | There are a few other cookbooks. [ffmprovisr](https://amiaopensource.github.io/ffmprovisr/) is quite complete, but doesn't feel like it covers the initial parts of understanding FFmpeg.
20 | [ffmpeg by example](https://ffmpegbyexample.com/) is a website which focuses on examples, but does not cover initial use either.
21 |
22 | haileys has [the beginnings of a cookbook](https://github.com/haileys/ffmpeg-cookbook), as does [Lorna Jane](https://lornajane.net/posts/2014/my-ffmpeg-cookbook), but these contain only a few examples. There is also [quite complete reference documentation on filters](https://ffmpeg.org/ffmpeg-filters.html#Filtering-Introduction).
23 |
24 | There are [various books](https://trac.ffmpeg.org/wiki/BooksAndOtherExternalResources) on FFmpeg, including one which is [available for free](https://github.com/jdriselvato/FFmpeg-For-Beginners-Ebook) online.
25 |
26 | The [FFmpeg wiki](https://trac.ffmpeg.org/) contains some examples, particularly in the [filtering section](https://trac.ffmpeg.org/#Filtering). You can also [ask questions on reddit](https://www.reddit.com/r/ffmpeg).
27 |
28 | What distinguishes this cookbook is that it is available for free, focuses on good internal and external linking, takes an innovative approach to the ordering of material, and focuses on examples that can be run immediately without any media.
29 |
30 |
31 | # Introduction: The core of FFmpeg
32 | `ffmpeg` has a number of core features that, when understood, help you do a range of things. This introduction tries to cover most of these core features. If you cannot understand the FFmpeg reference documentation, or an example that you have found on the internet, well enough to use it, I would suggest working through this guide or jumping to the feature closest to what you want to do. But don't worry - later examples will link back to this introduction.
33 |
34 | Once you have read this (or if you want to jump ahead) you might like to look at some [specific topics](#topics) or search for a recipe.
35 |
36 |
37 |
38 |
39 |
40 |
41 | ## Create 10 seconds of silence
42 | ```bash
43 | ffmpeg -filter_complex 'anullsrc=duration=10s [out]' -map '[out]' -ac 1 silence.wav
44 | ```
45 | You can play the silence you have created with `ffplay silence.wav`.
46 |
47 | This uses the [source filter (doc)](https://ffmpeg.org/ffmpeg-filters.html#toc-Filtering-Introduction) [anullsrc (doc)](https://ffmpeg.org/ffmpeg-filters.html#anullsrc), which generates ten seconds of silence (specified through the `duration` parameter) and then writes this to a stream labelled `[out]`. The `-map` option specifies that the stream labelled `[out]` is used in the output.
48 |
49 | You can also run this as `ffmpeg -filter_complex 'anullsrc=duration=10s' -ac 1 silence.wav`: the output of the filter is then implicitly used as the output stream.
50 |
51 | There are other ways to apply filters, such as [-af](#audio-filter) or [-vf](#video-filter), but both of these require precisely one input and one output. In general, you can use [`-filter_complex`](#filter-complex) instead of either of these options.
52 |
53 |
54 |
55 |
56 |
57 |
58 |
59 |
60 |
61 | ## Play a sine wave at 256 hertz for ten seconds
62 | ```bash
63 | ffplay -f lavfi 'sine=frequency=256:duration=10'
64 | ```
65 |
66 | This command specifies that the input format is `lavfi` (the "libavfilter" virtual input device, which reads from a filter graph). `ffplay` does not have a `-filter_complex` argument, but you can use this instead for complex filters. `-f lavfi` can also be used with `ffmpeg` - but there you must use `-i` to specify the input filter.
67 |
68 | The parameters for the source filter, [`sine` (doc)](https://ffmpeg.org/ffmpeg-filters.html#sine), are introduced with an `=` and then separated by `:`.
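Since a filter description is just a string, you can assemble it in the shell. A minimal sketch - the `freq` and `dur` variables here are our own, not FFmpeg's:

```shell
# Build the filter string 'sine=frequency=256:duration=10' from variables.
freq=256
dur=10
filter="sine=frequency=${freq}:duration=${dur}"
# You could then run: ffplay -f lavfi "$filter"
echo "$filter"
```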
69 |
70 | You do not need to use `ffplay` to play media: instead, you can [use the xv output device](#xv). This can be useful if you have multiple input sources - which `ffplay` does not accept.
71 |
72 | > **See also**: [create sine wave with aevalsrc](#aevalsrc-sine), [create arbitrary waveforms](#aevalsrc), [output to a GUI window](#xv)
73 |
74 |
75 | ## Write a sine wave to a file
76 | ```bash
77 | ffmpeg -filter_complex 'sine=frequency=256:duration=10' sine.wav
78 | ffplay sine.wav
79 | ```
80 | Note how this compares to [the previous example](#sine-play). This shows how you can convert between filters that play media and those that create a file containing the media.
81 |
82 | > **See also**: [a sine wave with aevalsrc](#aevalsrc-sine), [an arbitrary waveform](#aevalsrc)
83 |
84 |
85 |
86 |
87 | ## Combine two sine waves into a chord
88 | ```bash
89 | ffplay -f lavfi 'sine=frequency=256:duration=10s [one]; sine=frequency=512:duration=5s [two]; [one][two]amix'
90 | ```
91 | Here we create two streams, `[one]` and `[two]`, using the [sine filter (example)](#sine-play). We mix these together into the final output stream using the [amix filter (doc)](https://ffmpeg.org/ffmpeg-filters.html#amix). `amix` takes two streams as input and mixes them together into a single output.
92 |
93 | If you look at the entry for amix in [ffmpeg -filters](#list-filters) with `ffmpeg -filters | grep amix`, you will see that it is of type `N->A`, meaning it takes a number of streams as input and writes them to a single audio stream.
94 |
95 | > **See also**: [ffplay vs ffmpeg (examples)](#ffplay), [Combining video streams (example)](#hstack)
96 |
97 |
98 |
99 | ## List available filters
100 | ```bash
101 | ffmpeg -filters
102 | ```
103 |
104 | This command lists all available [filters (example)](#filters-example) together with some information about each filter. In the output, `|->A` means that a filter is an audio source, while `A->A` means that the filter takes an audio stream as [input (example)](#audio-filter) and produces one as [output (example)](#audio-source). Similarly, `V` indicates [video input](#video-filter) or [video output (example)](#video-source).
105 |
106 | Filters are also documented in the filters [manual page](#man-pages): `man ffmpeg-filters`.
107 |
108 | > **See also:** [About filters](#filters-abstract), [Show a filter's parameters (example)](#parameters)
109 |
110 |
111 |
112 |
113 | ## Show the parameters for a filter
114 | ```bash
115 | ffmpeg --help filter=color
116 | ```
117 |
118 | This command displays information about the [color filter (example)](#solid-color), such as which parameters can be used, their default values, [their order](#omit-names) and [what values can be given (example)](#expressions-example).
119 |
120 | In the output, some parameters are marked with a `T`, which indicates that they can be [modified through commands (example)](#commands-example).
121 |
122 | You can also get help for other objects such as a `decoder`, `encoder`, `demuxer`, `muxer`, `bsf` (bit stream filter), or `protocol`. See [`man ffmpeg`](#man-pages) for details.
123 |
124 | Unfortunately, some information about [filters (example)](#output-filter), such as which parameters can use [expressions (example)](#expression-example) and which variables can be used in those expressions, is not shown in this help output, but must be found in the [reference documentation](#documentation).
125 |
126 | > **See also**: [An example of using parameters (example)](#parameters-example), [Expressions (example)](#expression-example), [FFmpeg documentation](#documentation)
127 |
128 |
129 | ## Decrease the volume of an audio file
130 | ```bash
131 | ffmpeg -filter_complex 'sine=frequency=512:duration=10' sine.wav
132 | ffmpeg -i sine.wav -af 'volume=volume=0.5' quiet-sine.wav
133 | ffplay sine.wav
134 | ffplay quiet-sine.wav
135 | ```
136 |
137 | This creates a file containing a [sine wave (example)](#sine-play) and then uses the [`volume` filter](https://ffmpeg.org/ffmpeg-filters.html#volume) with `volume=0.5` to decrease the volume to 50% of the original. Because we are using a filter that has precisely one input and one output, we can use `-af` rather than `-filter_complex`, but `-filter_complex` would still work. We could also use `-filter:a` instead of `-af`, an identical option which may be easier to read.
138 |
139 | We then [play both files](#play) with the `ffplay` program.
140 |
141 | > **See also:** [ffplay vs ffmpeg (example)](#play), [Video filters (example)](#video-filter)
142 |
143 |
144 |
145 |
146 | ## Concatenate two audio files together
147 | ```bash
148 | ffmpeg -filter_complex 'sine=frequency=512:duration=2' sine1.wav
149 | ffmpeg -filter_complex 'sine=frequency=256:duration=5' sine2.wav
150 | ffmpeg -i sine1.wav -i sine2.wav -filter_complex '[0][1]concat=v=0:a=1' -ac 1 concat.wav
151 | ffplay concat.wav
152 | ```
153 |
154 | We [create two sine waves](#write-sine) as in the previous recipes. We then combine the first (`[0]`) and second (`[1]`) inputs with the `concat` filter. We must specify that concat outputs no video streams and one audio stream, because these are not the defaults: concat works with one video stream by default.
155 |
156 |
157 | Here is the documentation for `concat`:
158 |
159 | ```
160 | > ffmpeg --help filter=concat
161 | Filter concat
162 | Concatenate audio and video streams.
163 | Inputs:
164 | dynamic (depending on the options)
165 | Outputs:
166 | dynamic (depending on the options)
167 | concat AVOptions:
168 | n ..FVA...... specify the number of segments (from 1 to INT_MAX) (default 2)
169 | v ..FV....... specify the number of video streams (from 0 to INT_MAX) (default 1)
170 | a ..F.A...... specify the number of audio streams (from 0 to INT_MAX) (default 0)
171 | unsafe ..FVA...... enable unsafe mode (default false)
172 | ```
173 |
174 |
175 |
176 | ## Play ten seconds of red
177 | ```bash
178 | ffplay -f lavfi color=color=red:duration=10s
179 | ```
180 |
181 | Here we use the [color filter (doc)](https://ffmpeg.org/ffmpeg-filters.html#allrgb_002c-allyuv_002c-color_002c-colorchart_002c-colorspectrum_002c-haldclutsrc_002c-nullsrc_002c-pal75bars_002c-pal100bars_002c-rgbtestsrc_002c-smptebars_002c-smptehdbars_002c-testsrc_002c-testsrc2_002c-yuvtestsrc) together with the color name of `red` and a `duration` of 10s.
182 |
183 | You can also run this as `ffplay -f lavfi color=red:duration=10s`, since `color` is the filter's first parameter and its name can be omitted.
184 |
185 | You can list the available colors with `ffmpeg -colors`.
186 |
187 |
188 | > **See also**: [ffplay vs ffmpeg](#play)
189 |
190 |
191 |
192 | ## Create an image consisting of solid red
193 | ```bash
194 | ffmpeg -filter_complex 'color=color=red' -frames:v 1 red.png
195 | ffplay red.png
196 | ```
197 | Here we specify that we write one video frame with `-frames:v 1`.
198 |
199 | > **See also**: [Working with images](#images), [Extract a frame](#ts-frame)
200 |
201 |
202 | ## Write ten seconds of red to a file
203 | ```bash
204 | ffmpeg -filter_complex 'color=color=red:duration=10s' red.mp4
205 | ```
206 |
207 | We use the [color filter (doc)](https://ffmpeg.org/ffmpeg-filters.html#allrgb_002c-allyuv_002c-color_002c-colorchart_002c-colorspectrum_002c-haldclutsrc_002c-nullsrc_002c-pal75bars_002c-pal100bars_002c-rgbtestsrc_002c-smptebars_002c-smptehdbars_002c-testsrc_002c-testsrc2_002c-yuvtestsrc) together with the duration parameter of ten seconds and the color parameter of red.
208 |
209 | We use `-filter_complex` because there is no input file: `color` is a source filter. We specify the output file as `red.mp4`.
210 |
211 |
212 |
213 | ## Create a video that fades between two colours
214 | ```bash
215 | ffplay -f lavfi 'color=color=red:duration=5s [red]; color=color=blue:duration=5s [blue]; [red][blue]xfade=duration=5s'
216 | ```
217 |
218 | We create two streams, one consisting of solid red, the other of solid blue. We then feed both these streams into the `xfade` filter which "cross-fades" between the two over 5 seconds.
219 |
220 | > **See also**: [ffplay vs ffmpeg](#play)
221 |
222 |
223 |
224 |
225 |
226 | ## Create a video that counts up to 10
227 | ```bash
228 | ffplay -f lavfi -i 'color=color=white:duration=10s, drawtext=text=%{pts}'
229 | ```
230 |
231 | This uses the [drawtext filter's](https://ffmpeg.org/ffmpeg-filters.html#drawtext-1) text expansion to show the current timestamp in each frame. `%{pts}` is [expanded](https://ffmpeg.org/ffmpeg-filters.html#Text-expansion) in each frame to the number of seconds into the video.
232 |
233 | See the [section on positioning text](#positioning).
234 |
235 |
236 |
237 |
238 | ## Place text in the middle of the screen
239 | ```bash
240 | ffplay -f lavfi -i 'color=color=white, drawtext=text=hello:x=(main_w - text_w) / 2:y=(main_h-text_h) /2'
241 | ```
242 |
243 | We [render text using the drawtext filter](#text) as we did in the previous recipe. But we also use the `x` and `y` parameters, which accept [drawtext's own expression language (doc)](https://ffmpeg.org/ffmpeg-filters.html#Syntax).
244 |
245 | Note that we use the division operator, `/`, in the expression. `main_w` represents the video width, `main_h` its height, `text_w` the rendered text's width and `text_h` its height.
246 |
247 | In the `x` parameter for `drawtext`, [expressions](#expressions-topic) are evaluated using [ffmpeg's own expression language (doc)](https://ffmpeg.org/ffmpeg-utils.html#toc-Expression-Evaluation), which provides various functions (`sin`, `cos`, etc.) and operators (`*`, `/`, `-`, etc.).
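To see what the centring expression computes, here is the same arithmetic with concrete sizes - a 640x480 video and 100x20 rendered text, numbers of our own choosing:

```shell
# (main_w - text_w) / 2 and (main_h - text_h) / 2 with assumed sizes.
main_w=640; main_h=480
text_w=100; text_h=20
x=$(( (main_w - text_w) / 2 ))
y=$(( (main_h - text_h) / 2 ))
echo "text placed at ($x, $y)"   # top-left corner that centres the text
```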
248 |
249 | > **See also**: [ffplay vs ffmpeg](#play)
250 |
251 |
252 |
253 |
254 | ## Capture the screen using Linux
255 | ```bash
256 | ffplay -f x11grab -i ''
257 | ```
258 |
259 | This example works on [Linux](#programming), but there are [analogs for other operating systems](https://trac.ffmpeg.org/wiki/Capture/Desktop).
260 |
261 | This [section of the FFmpeg wiki](https://trac.ffmpeg.org/wiki/Capture/Desktop) describes how to capture on different systems. You can also [capture a specific window](#capture-window), or a [specific region](#capture-region).
262 |
263 | [x11grab](https://www.ffmpeg.org/ffmpeg-devices.html#x11grab) supports various options which can for example be used to capture a section of the screen.
264 |
265 | > **See also**: [Recording your computer](#computer), [Record a window](#capture-window), [Record a region of your screen](#capture-region), [list devices](#devices), [ffplay vs ffmpeg](#play)
266 |
267 |
268 | ## Change the size of a video
269 | ```bash
270 | ffmpeg -filter_complex 'color=color=white:size=1024x512:duration=5s, drawtext=text=%{pts}:fontsize=h' count.webm
271 | ffmpeg -i count.webm -filter_complex 'scale=512:256' count-resized.webm
272 | ffplay count-resized.webm
273 | ```
274 |
275 | First we create a video with a known size of 1024x512 which [displays the time on the video (example)](#timestamp) using drawtext. We then use the [scale filter (doc)](https://ffmpeg.org/ffmpeg-filters.html#scale-1) to scale it to a given size. Here we [omit the names (example)](#omit-name) of the `width` and `height` parameters of scale.
276 |
277 | You can replace one of the `width` or `height` parameters with `-1` to make FFmpeg calculate that parameter from the [aspect ratio](#video-glossary).
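For example, with the 1024x512 video above, asking for a width of 512 and a height of `-1` makes FFmpeg solve for the height that preserves the aspect ratio - the same arithmetic as this sketch:

```shell
# Height that scale=512:-1 would compute for a 1024x512 source.
src_w=1024; src_h=512
out_w=512
out_h=$(( out_w * src_h / src_w ))
echo "${out_w}x${out_h}"
```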
278 |
279 | > **Reference**: The FFmpeg wiki has a [section on scaling](https://trac.ffmpeg.org/wiki/Scaling)
280 |
281 |
282 |
283 | ## Create a video with the phrase "hello world" on a white background
284 | ```bash
285 | ffmpeg -filter_complex 'color=color=white:duration=10s, drawtext=text=hello:fontsize=20' out.mp4
286 | ```
287 |
288 | `drawtext` takes a video stream as input and draws text on top of it. Note the use of `,` to feed the output of one stream into the next.
289 |
290 | We could, however, have used labels instead as we did before when [concatenating audio streams](#concat), like so:
291 |
292 | ```bash
293 | ffmpeg -filter_complex 'color=color=white:duration=10s [one]; [one] drawtext=text=hello:fontsize=20' out.mp4
294 | ```
295 |
296 |
297 |
298 | ## Trim a video to five seconds
299 | ```bash
300 | ffmpeg -filter_complex 'color=color=white, drawtext=text=%{pts}, trim=duration=10s' count-to-ten.mp4
301 | ffmpeg -i count-to-ten.mp4 -vf 'trim=duration=5s' count-to-five.mp4
302 | ```
303 |
304 | First we create a video which shows the timestamp in each frame and is ten seconds long. We then apply the `trim` filter to limit this to five seconds.
305 |
306 | Here we can use `-vf` because our filter has precisely one input and one output - but we could equally well use [-filter_complex (example)](#filter-complex) instead.
307 |
308 |
309 | ## Turn a static image into a five second video of the image
310 | ```bash
311 | ffmpeg -filter_complex 'color=color=white, drawtext=text=hello:fontsize=40' -frames:v 1 static.png
312 | ffmpeg -i static.png -filter_complex 'nullsrc=duration=5s [vid]; [vid][0]overlay' out.mp4
313 | ```
314 |
315 | First we [create an image](#image) using the `-frames:v 1` option. We then create a 5 second empty video using the `nullsrc` filter and overlay the static image on it.
316 |
317 | ## Concatenate two videos together
318 | ```bash
319 | ffmpeg -filter_complex 'color=color=white:duration=5s, drawtext=text=%{pts}' count-up.mp4
320 | ffmpeg -filter_complex 'color=color=white:duration=5s, drawtext=text=%{pts}, reverse' count-down.mp4
321 | ffmpeg -i count-up.mp4 -i count-down.mp4 -filter_complex '[0][1]concat' concat.mp4
322 | ```
323 |
324 | First we create two videos: one that [counts up (example)](#timestamp) for five seconds, and the reverse of this using the [reverse (doc)](https://ffmpeg.org/ffmpeg-filters.html#reverse) filter.
325 |
326 | We then use the concat filter (as we did for [audio](#concat)), but this time we do not need to pass it any parameters - since the default output is a single video stream and no audio stream.
327 |
328 | ## Add a static introduction image to the beginning of a video
329 | ```bash
330 | ffmpeg -filter_complex 'color=color=white, drawtext=text=hello:fontsize=40' -frames:v 1 static.png
331 | ffmpeg -filter_complex 'color=color=red:duration=5s [red]; color=color=green:duration=5s [green]; [red][green]xfade=duration=5s' video.mp4
332 | ffmpeg -i static.png -i video.mp4 -filter_complex 'nullsrc=duration=2s [bg]; [bg][0]overlay [intro]; [intro][1]concat' with-title.mp4
333 | ```
334 |
335 | First we create a static image with the word "hello" and a video which fades from red to green. Then we combine the two, [overlaying the static image](#static-image) on a 2 second video labelled `[intro]`. This is then concatenated with the video.
336 |
337 | ## Add a logo to the corner of a picture
338 | ```bash
339 | ffmpeg -filter_complex 'color=white:size=100x100, drawtext=text=L:fontsize=h' -frames:v 1 logo.png
340 | ffmpeg -filter_complex 'color=red [red]; color=blue [blue]; [red][blue]xfade=duration=5s, trim=duration=5s' video.webm
341 | ffmpeg -i video.webm -i logo.png -filter_complex '[0][1]overlay=x=W-w:y=H-h' final.webm
342 | ffplay final.webm
343 | ```
344 |
345 | We [create an image (example)](#image) consisting of the letter L, [scaled (example)](#scale-text) to the size of the image. We then create a [video fading between colours (example)](#fade). These act as inputs for our main filter.
346 |
347 | We then input both these files into FFmpeg. We must use `-filter_complex` because we have two inputs.
348 |
349 | We place the logo over the video using the overlay filter by specifying the `x` and `y` parameters. These are [expressions](#expressions) using the variables `W`, `w`, `H` and `h` for the widths and heights of the first and second inputs.
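With concrete sizes, `x=W-w:y=H-h` pins the logo to the bottom-right corner. The main video here is assumed to be 320x240 (the `color` source's default size); the logo is the recipe's 100x100:

```shell
# overlay=x=W-w:y=H-h: W,H are the main video's size, w,h the logo's.
W=320; H=240   # assumed main video size (color source default)
w=100; h=100   # logo size from the recipe
echo "logo placed at ($((W - w)), $((H - h)))"   # bottom-right corner
```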
350 |
351 | ## Add music to a video
352 | ```
353 | ffmpeg -filter_complex 'color=blue [blue]; color=yellow [yellow]; [blue][yellow]xfade=duration=10s, trim=duration=10s' fade.webm
354 | ffmpeg -filter_complex 'sine=frequency=256:duration=10' tone.wav
355 | ffmpeg -i fade.webm -i tone.wav music.mp4
356 | ```
357 |
358 | First we create a [video fading between colours (example)](#fade) and a ten second [sine wave (example)](#sine-play). We then pass both files to ffmpeg, which takes the video stream from the first input and the audio stream from the second for the output.
356 |
357 | ## Make a black and white version of the screen
358 | ```bash
359 | ffmpeg -f x11grab -i '' -vf 'hue=s=0, format=yuv420p' -f xv ''
360 | ```
361 |
362 | Here we take the screen as [input](#x11grab), use the hue filter's saturation parameter (`s=0`) to remove all colour, and send the result to the [xv output device](#xv).
363 |
364 |
365 |
366 | ## Change color of a frame over time
367 | The `sendcmd` filter allows you to change the parameters of other filters at particular timestamps ([amongst other things (doc)](https://ffmpeg.org/ffmpeg-filters.html#sendcmd_002c-asendcmd)).
368 |
369 | ```
370 | ffplay -f lavfi -i "color@x, sendcmd=commands='1 color@x color blue; 2 color@x color green; 3 color@x color purple'"
371 | ```
372 |
373 | `color@x` creates an instance of the color filter with the name `color@x`, which can be used to distinguish different filter instances.
374 |
375 | For certain changes over time, particularly "continuous" changes, you can use [expressions with the `t` variable](#time-expressions), including the [if expression](#if-expression) - but some parameters do not support [expressions](#expressions-topic), and for discrete changes sendcmd can be more useful.
376 |
377 | Note that [audio filters](#asendcmd) and video filters use different filters to receive commands.
378 |
379 | > **See also**: [Section on commands](#commands-topic)
380 |
381 |
382 |
383 | ## Change the frequency of an audio bandpass
384 | ```
385 | ffplay -f lavfi -i "aevalsrc=exprs=0.2 * gte(sin(128 * t * 2 * PI)\,0), bandpass=frequency=250, asendcmd=commands='2 bandpass frequency 100'"
386 | ```
387 |
388 | This example is similar to the [previous one](#command-example) but for an audio filter rather than a video filter.
389 |
390 | We create a square wave, because these [have a lot of overtones (wiki)](https://en.wikipedia.org/wiki/Subtractive_synthesis), using the [aevalsrc filter (example)](#aevalsrc) with [an expression](#expression) that lets us specify a formula. We send this waveform through a bandpass filter, and then into [asendcmd (doc)](https://ffmpeg.org/ffmpeg-filters.html#sendcmd_002c-asendcmd), which has a command to change the frequency of the bandpass filter after two seconds.
391 |
392 | Unfortunately, FFmpeg requires you to use a different filter for sending commands to audio filters than video filters, so we use [asendcmd (doc)](https://ffmpeg.org/ffmpeg-filters.html#sendcmd_002c-asendcmd) rather than [sendcmd (example)](#command-example).
393 |
394 | > **See also**: [Audio engineering](#audio)
395 |
396 |
397 | ## List available devices
398 | ```bash
399 | ffmpeg -devices
400 | ```
401 |
402 | `D` means an input device (demuxer). `E` means an output device (muxer).
403 |
404 | You can see more information about devices with [`man ffmpeg-devices`](#man-pages).
405 |
406 | > **See also**: [xv device for video output](#xv), [x11grab device for video input](#x11grab)
407 |
408 |
409 | ## Create a grid of 16 frames taken from the video
410 | ```
411 | ffmpeg -filter_complex 'color=white, drawtext=text=%{pts}:x=w/2 + 0.6*w/2*sin((t / 8) * 2 * PI):y=h/2 + 0.6 * h/2*cos((t/8) * 2 * PI), drawbox=w=iw:h=ih:color=red, trim=duration=8s' orbit.mp4
412 | ffmpeg -i orbit.mp4 -r 2 orbit-%02d.png
413 | ls orbit-*.png | sed 's/^/-i /' | xargs ffmpeg -y -filter_complex '[0][1][2][3]hstack=4 [row1]; [4][5][6][7] hstack=4[row2]; [8][9][10][11]hstack=4[row3]; [12][13][14][15]hstack=4 [row4]; [row1][row2][row3][row4] vstack=4' -f apng grid.png
414 | ffplay grid.png
415 | ```
416 | We first create an interesting (or sufficiently interesting for our purposes here) video of the current timestamp moving in a circle. We then extract two frames a second by setting the [rate parameter, -r, (doc)](https://ffmpeg.org/ffmpeg.html#toc-Video-Options) to 2 frames per second, outputting each frame to a file by using the filename pattern `orbit-%02d.png`. This is a format string in the style of the [printf](#programming) function: the `%02d` means that e.g. the 9th frame is written to the file `orbit-09.png`.
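The `%02d` pattern behaves like the shell's own `printf`, which you can use to check what filenames will be produced:

```shell
# ffmpeg substitutes the frame number for %02d, zero-padding to two digits.
printf 'orbit-%02d.png\n' 9    # prints orbit-09.png
printf 'orbit-%02d.png\n' 12   # prints orbit-12.png
```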
417 | In the next command we use some command-line magic to run `ffmpeg -i orbit-01.png -i orbit-02.png ...`, horizontally stacking the images into rows with [hstack (doc)](https://ffmpeg.org/ffmpeg-filters.html#hstack) and vertically stacking these rows with vstack. Each of these filters takes four [inputs (example)](#two-inputs): we set the [first parameter (example)](#omit-names), `inputs`, of each to 4.
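You can inspect what the `sed`/`xargs` pipeline builds by substituting `echo` for the real command (three hypothetical filenames here instead of sixteen):

```shell
# sed prefixes each filename with '-i '; xargs appends them all to the
# command line - shown with echo rather than actually running ffmpeg.
printf '%s\n' orbit-01.png orbit-02.png orbit-03.png \
  | sed 's/^/-i /' \
  | xargs echo ffmpeg -y
# prints: ffmpeg -y -i orbit-01.png -i orbit-02.png -i orbit-03.png
```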
418 |
419 |
420 | ## Finishing up
421 | Hopefully this introduction has introduced you to a broad range of what ffmpeg can do and how it works, through examples. The rest of the guide covers topics in the style of a more traditional cookbook.
422 |
423 | Here are some of the topics covered:
424 |
425 | * [Recording your computer](#recording)
426 | * [Formatting text](#text)
427 | * [Audio engineering](#audio)
428 | * [Interesting outputs from FFmpeg](#text)
429 | * [Controlling FFmpeg externally](#external)
430 |
431 | We also cover some more topics on the internals of ffmpeg - while linking to examples.
432 |
433 | * [Using commands](#commands)
434 | * [Writing more readable filters](#readability)
435 | * [Getting information about FFmpeg](#reflection)
436 |
437 |
438 |
439 |
440 |
441 |
442 | # Recording your computer
443 |
444 |
445 | ## Record a specific window using Linux
446 | ```bash
447 | xwininfo
448 | ffplay -window_id 0x520003e -f x11grab -i ''
449 | ```
450 |
451 | This example works on [Linux](#programming), but there are [similar devices for other operating systems](https://trac.ffmpeg.org/wiki/Capture/Desktop).
452 |
453 | This uses the `-window_id` option which is an option for the x11grab [device](#devices) to record a specific window [on your screen](#capture-screen).
454 |
455 | There are [various options](https://www.ffmpeg.org/ffmpeg-devices.html#toc-Options-20) that allow you to select a specific region (`grab_x`, `grab_y`, `video_size`), follow the mouse (`-follow_mouse`), hide the mouse (`-draw_mouse`), and other things (`framerate`, `show_region`).
456 |
457 | ## Record a specific region of the screen
458 | ```bash
459 | ffmpeg -f x11grab -select_region 1 -i '' region.mp4
460 | # ffmpeg -f x11grab -i '' -select_region 1 region.mp4 - this does not work
461 | ffplay region.mp4
462 | ```
463 |
464 | This uses [x11grab](#capture-screen) to capture a region selected with the cursor when run (specified by [the option](https://ffmpeg.org/ffmpeg-devices.html#Options-20) `-select_region 1`). Note that `-select_region 1` must come before `-i`.
465 |
466 | > **See also**: [Record a specific window](#specific-window), [record the screen](#capture-screen), [List devices](#devices)
467 |
468 | # Cutting, combining and formatting
469 | ## Combining video and audio
470 | ```
471 | ffmpeg -filter_complex 'aevalsrc=pow(sin(128 * t * 2 * PI)\, sin(t) * sin(t)), atrim=duration=10s' warble.wav
472 | ffmpeg -filter_complex 'color=white [white]; color=black [black]; [white][black]xfade=duration=10s, trim=duration=10s' transition.webm
473 | ffmpeg -i warble.wav -i transition.webm combined.mp4
474 | ffplay combined.mp4
475 | ```
476 |
477 | First we create a [warbling sound](#aevalsrc) and a [video fading between colours (example)](#fade). When given several inputs and no `-map` options, ffmpeg selects the best video stream and the best audio stream for the output, so the video from `transition.webm` is combined with the audio from `warble.wav`.
478 |
479 | ## Combine two videos with different sizes
480 | ```
481 | ffmpeg -filter_complex 'color=red:size=200x100, trim=duration=10s' red.mp4
482 | ffmpeg -filter_complex 'color=blue:size=300x400, trim=duration=10s' blue.mp4
483 | ffmpeg -i red.mp4 -i blue.mp4 -filter_complex '[0][1]xstack=layout=0_0|w0_0:fill=black' combined.mp4
484 | ```
485 | Here we create two videos [consisting of solid colour (example)](#solid-color) which are ten seconds long but have different sizes. The first is red, 200 pixels wide and 100 pixels high. The second is blue, 300 pixels wide and 400 pixels high.
486 |
487 | We then combine these together with the [xstack filter (doc)](https://ffmpeg.org/ffmpeg-filters.html#xstack-1). `xstack` works by drawing videos onto an imaginary canvas. The `layout` parameter specifies the top-left coordinate of the rectangle for each source, using the variables `w0`, `h0`, `w1`, `h1`, etc. for the widths and heights of the inputs. Somewhat confusingly, the indexes start at 0 rather than 1 (this is referred to as [zero-based numbering](#zero-index)).
488 |
489 | So in this example the first video is displayed at `(0, 0)` and the second is placed to its right at the same height, at `(first_source_width, 0)`.
490 |
491 | If we wanted to reverse this, so that the second video is drawn on the left and the first to its right, we could use the following expression:
492 |
493 | `ffmpeg -i red.mp4 -i blue.mp4 -filter_complex '[0][1]xstack=layout=w1_0|0_0:fill=black' combined.mp4`
494 |
495 | If the videos have matching dimensions you can use the simpler [hstack (example)](#hstack) or [vstack (doc)](https://ffmpeg.org/ffmpeg-filters.html#vstack-1) filters for this.
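For example, here is a minimal `hstack` sketch, a variation of the commands above with the heights made equal (the file names are illustrative):

```bash
# Two ten-second solid-colour videos with the same height (100 pixels)
ffmpeg -filter_complex 'color=red:size=200x100, trim=duration=10s' left.mp4
ffmpeg -filter_complex 'color=blue:size=300x100, trim=duration=10s' right.mp4
# Place them side by side; unlike xstack, no layout parameter is needed
ffmpeg -i left.mp4 -i right.mp4 -filter_complex '[0][1]hstack' side_by_side.mp4
```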
496 |
497 | > **Reference**: The FFmpeg wiki has [a section on xstack](https://trac.ffmpeg.org/wiki/Create%20a%20mosaic%20out%20of%20several%20input%20videos%20using%20xstack) together with diagrams.
498 |
499 | > **See also**: [Pixel](#av-concepts), [coordinates](#av-concepts)
500 |
501 |
502 | # Formatting text
503 | ## Placing text in the middle of the screen
504 | See the [earlier example](#middle)
505 |
506 |
507 |
508 | ## Placing text at different positions
509 | ```bash
510 | ffplay -f lavfi -i 'color=color=white, drawtext=text=bottom-right:x=main_w - text_w:y=(main_h-text_h), drawtext=text=top-left:x=0:y=0, drawtext=text=top-right:x=main_w-text_w:y=0, drawtext=text=bottom-left:x=0:y=main_h - text_h'
511 | ```
512 |
513 | This recipe renders text in various positions. See the [previous recipe](#middle) for the meaning of expressions.
514 |
515 | > **See also**: [ffplay vs ffmpeg](#play)
516 |
517 |
518 | ## Scaling text to the size of the video
519 |
520 | ```bash
521 | ffplay -f lavfi 'color=color=white:size=500x100, drawtext=text=hello:fontsize=main_h'
522 | ```
523 |
524 | Here we use an expression in `fontsize` with the `main_h` variable so that the font is the height of the screen. We also use the `size` parameter of `color` so that the full string fits in the window.
525 |
526 |
527 | ## Draw bold text
528 | [drawtext (doc)](https://ffmpeg.org/ffmpeg-filters.html#drawtext-1) offloads the styling of fonts to the [fontconfig library (doc)](https://www.freedesktop.org/wiki/Software/fontconfig/) or a font file that you give it. This means you need to understand ["Font Matching" and "Font Patterns"](https://fontconfig.pages.freedesktop.org/fontconfig/fontconfig-user.html) a little to use fonts.
529 |
530 | `fc-match` and `fc-match -a` are useful for searching fonts and showing the font that a query will match.
531 |
532 | ```
533 | ffplay -f lavfi 'color=white, drawtext=text=hello:fontfile=\\:style=Bold:fontsize=h'
534 | ```
535 |
536 | This query uses the fontconfig generic query string `:style=Bold` in the `fontfile` field. We need to escape the colon with two backslashes. We use the drawtext filter (example) as before.
537 |
538 |
539 | ## Italic text
540 | ```
541 | ffplay -f lavfi 'color=white, drawtext=text=hello:fontfile=\\:style=Italic:fontsize=h'
542 | ```
543 |
544 | This query uses a fontconfig generic query [like the bold example (example)](#bold) together with the [drawtext filter to draw text (example)](#text), but the style is Italic.
545 |
546 | ## Draw text in serif font
547 | ```
548 | ffplay -f lavfi 'color=white, drawtext=text=hello:fontfile=Serif:fontsize=h'
549 | ```
550 | Again we [use a fontconfig query (example)](#bold). This time we search by font family name.
551 |
552 |
553 | # Working with images
554 | FFmpeg provides filters to operate on video streams and audio streams and to combine these streams together into files. However, because an image is a video containing one frame FFmpeg can also operate on images.
555 |
556 | Some of the image-based functionality which FFmpeg provides is analogous to that of [ImageMagick](https://imagemagick.org/script/command-line-processing.php). ImageMagick only operates on images, so if you learn FFmpeg then what you have learned will also apply to videos. However, ImageMagick has more image-specific features. Animated GIFs and WebP images are examples of the limited "video functionality" that ImageMagick provides.
557 |
558 | In general, if you want to output an image you can [use the `-frames:v 1` option (example)](#frames) to extract an image.
559 |
560 |
561 |
562 | ## Extract a frame from a video
563 | ```
564 | ffmpeg -filter_complex 'color=color=white, drawtext=text=%{pts}, trim=duration=10s' counting.mp4
565 | ffmpeg -i counting.mp4 -vf 'trim=start=4s' -frames:v 1 four.png
566 | ```
567 |
568 | First we [generate a video that counts up (example)](#count) to sample from. Then we use the [trim filter (example)](#trim) [(doc)](https://ffmpeg.org/ffmpeg-filters.html#trim) to skip to four seconds into the video (using the `start` option). We use the `frames:v` option to [extract a single frame (example)](#frames) and save it as a PNG.
569 |
570 |
571 |
572 |
573 |
574 |
575 | # Audio engineering
576 | FFmpeg is not explicitly designed for audio engineering and there are a number of tools better suited for serious engineering. However, `ffmpeg` is performant and convenient for editing videos that need a little engineering, and these examples are fun, interactive, and demonstrate FFmpeg's features.
577 | ## Generating a sine wave using aevalsrc
578 | [aevalsrc (doc)](https://ffmpeg.org/ffmpeg-filters.html#aevalsrc) allows you to use a function to express a sound. Here we reimplement the [sine wave recipe](#sine) using this.
579 |
580 | ```
581 | ffplay -f lavfi -i 'aevalsrc=exprs=sin((128) *t*2*PI)'
582 | ```
583 |
584 | Here we use an expression involving `t`, which is the time in seconds, written in [ffmpeg's own expression language](https://ffmpeg.org/ffmpeg-utils.html#toc-Expression-Evaluation), which provides various functions.
585 |
586 |
587 |
588 | ## Generating some interesting sounds with aevalsrc
589 | ```
590 | # Create a sine wave that slowly changes frequency
591 | ffplay -f lavfi -i 'aevalsrc=exprs=sin((32 + 100* tanh(t/10) ) *t*2*PI)'
592 |
593 | # Create a square wave
594 | ffplay -f lavfi -i 'aevalsrc=exprs=gte(sin(250 * t * 2 * PI)\, 0)'
595 | ```
596 |
597 | All of these examples use [aevalsrc (example)](#aevalsrc-sine) [(doc)](https://ffmpeg.org/ffmpeg-filters.html#aevalsrc).
598 |
599 | In the second example we use `sin` to give us something periodic with a known period and then use the greater-than-or-equal function `gte` to turn this into a [square wave (wiki)](https://en.wikipedia.org/wiki/Square_wave). Note how we escape the comma separating `gte`'s arguments with `\,`.
600 |
601 |
602 |
603 | # Interesting outputs for ffmpeg
604 |
605 | ## Display output rather than write it to a file
606 |
607 | You can use [ffplay (example)](#ffplay) to display a file or the output of a filter. But this is a wrapper around features that `ffmpeg` itself provides.
608 |
609 | ```bash
610 | ffmpeg -filter_complex 'color=red[red]; color=blue[blue]; [red][blue]xfade=duration=5s, trim=duration=5s' spectrum.mp4
611 | ffmpeg -re -i spectrum.mp4 -vf 'format=yuv420p' -f xv title
612 | ```
613 |
614 | `-re` tells `ffmpeg` to read the file at normal speed (otherwise ffmpeg would read as fast as possible). `format=yuv420p` converts to a format that `xv` accepts. `-re` does not work when you use [source filters (example)](#source); for those you have to use the [realtime filter (example)](#realtime).
615 |
616 |
617 | ## Limit the speed when using xv with filters
618 | The `xv` output device allows ffmpeg to create real-time video output. If you do not use the [`-re` flag (example)](#xv) or if you use [source filters (example)](#video-source-filters) then by default ffmpeg will run as fast as possible - even when writing to `xv`. You will not be able to watch the output and this will use 100% of the CPU. You can fix this using the [realtime (doc)](https://ffmpeg.org/ffmpeg-filters.html#realtime_002c-arealtime) filter.
619 |
620 | ```
621 | ffmpeg -filter_complex 'color=white, drawtext=text=%{pts}:fontsize=h, realtime' -f xv ''
622 | # This would run too fast:
623 | # ffmpeg -filter_complex 'color=white, drawtext=text=%{pts}:fontsize=h' -f xv ''
624 |
625 | ffmpeg -filter_complex 'color=white, drawtext=text=%{pts}:fontsize=h, trim=duration=10s' video.mp4
626 | ffmpeg -i video.mp4 -vf 'realtime' -f xv ''
627 | ```
628 |
629 | The first example creates a stream counting up using [drawtext (example)](#drawtext) and then [sends this to xv (example)](#xv). We specify `realtime` as the final filter to play at normal speed.
630 |
631 | > **See also**: [Play media with ffplay (example)](#ffplay)
632 |
633 |
634 | ## Playing videos on the terminal with kitty
635 | The [terminal emulator kitty](https://github.com/kovidgoyal/kitty) extended the protocol that terminals use so that it can render pixels. Some other terminal emulators have adopted this standard. You can use this to display the output from ffmpeg - including videos - directly in the terminal.
636 |
637 | This can be used to display images directly in the terminal. This is partly a neat trick - but it can have some benefits in terms of productivity and avoiding "paper cuts" while working because you do not have to move windows around. For many use cases one might prefer to use streaming to send output to a consistent window.
638 |
639 | The utility [timg](https://github.com/hzeller/timg) works well here.
640 |
641 | ```
642 | sudo apt-get install timg
643 | sudo apt-get install kitty
644 | kitty
645 | ffmpeg -loglevel 16 -filter_complex 'color=blue, drawtext=text=hi:x=w/3 + w/3*sin(t):y=h/3 + h/3*cos(t):fontsize=h/5' -f webm - | timg -V -
646 | ```
647 |
648 | First, we install `timg` to play videos in a terminal and `kitty` as a terminal that supports images. We then create an interesting video using ffmpeg to [render some text (example)](#text) which is animated using an [expression in time (example)](#time-expression) to [change the position of the text (example)](#positioning).
649 |
650 | The output is written to [standard output](#programming), as specified by `-`, and then read from standard input by `timg`. Because there is no file name we must specify the format of the output stream using `-f`. We use `-loglevel 16` to hide logging about FFmpeg's normal operation which would otherwise get in the way of the video playing in your terminal.
651 |
652 | ## Controlling FFmpeg externally
653 | If you are using real-time outputs such as [X11 video output](#xv) or streaming, it may make sense to control FFmpeg externally. You can do this using [commands (example)](#command-example) which can be [controlled from the command line or programs via zmq](#zmq).
654 |
655 |
656 | # Writing more readable filters
657 | The readability of code can be a bit of a trade-off. What is easier to read for an expert may use concepts that a new user finds difficult to grasp. Code has an audience like prose, and the features used should cater to this audience. There are various approaches to changing how filters are written which may improve readability.
658 |
659 |
660 | ## Using source filters as input
661 | In some examples, we use complex filters with `-filter_complex` to support [source filters (example)](#source-filters). The output of source filters will often be [labelled (example)](#labels). You may be able to simplify such filters by moving parts of the filter out of the main filter and into an input source [of type lavfi (example)](#lavfi).
662 |
663 | Here is an example:
664 |
665 | ```bash
666 | ffmpeg -f lavfi -i 'color=red' -f lavfi -i 'color=green' -f lavfi -i 'color=blue' -filter_complex '[0][1][2]hstack=inputs=3, trim=duration=5s' stacked.mp4
667 | ```
668 |
669 | This command uses [`-f lavfi` (example)](#lavfi) to create three inputs consisting of solid red, green, and blue. We then combine these three inputs with [hstack](https://ffmpeg.org/ffmpeg-filters.html#hstack-1) to produce a "flag" video. The argument of `-filter_complex` is definitely shorter as a result.
670 |
671 | > **See also**: [Combining video streams of different sizes (example)](#xstack), Combining audio filters [simultaneously (example)](#amix), [sequentially (example)](#concat-stream)
672 |
673 | ## Creating a filter graph in a file
674 | With `-filter_complex_script`, you can read a filter specification ("filter graph") from a file. Sometimes a command can be more readable if details are separated out into a file.
675 |
676 | ```bash
677 | echo "color=red:duration=5s" > filter.lavfi
678 | ffmpeg -filter_complex_script filter.lavfi red.mp4
679 | ffplay red.mp4
680 | ```
681 |
682 | ## Omitting parameter names
683 | Parameters have an order that you can view with `ffmpeg --help filter=color`. If you provide parameters in this order then you can omit the parameter names.
684 |
685 | Here is an example:
686 |
687 | ```bash
688 | ffmpeg --help filter=color
689 | ffplay -f lavfi -i 'color=white:100x100'
690 | ```
691 |
692 | We output a video of solid colour, with a `color` of `white` and a `size` of `100x100`. We can omit both parameter names since the first parameter is `color` and the second is `size`.
693 |
694 | Here is some of the output from `ffmpeg --help filter=color`. Notice the order of parameters and the fact that some parameters have aliases which do not form part of the parameter order.
695 |
696 | ```
697 | color AVOptions:
698 | color ..FV.....T. set color (default "black")
699 | c ..FV.....T. set color (default "black")
700 | size ..FV....... set video size (default "320x240")
701 | s ..FV....... set video size (default "320x240")
702 | rate ..FV....... set video rate (default "25")
703 | r ..FV....... set video rate (default "25")
704 | duration ..FV....... set video duration (default -0.000001)
705 | d ..FV....... set video duration (default -0.000001)
706 | sar ..FV....... set video sample aspect ratio (from 0 to INT_MAX) (default 1/1)
707 | ```
708 |
709 | ## Using short or long flags
710 | Many of the flags for FFmpeg have long and short versions. Using the long version may make it easier to work out what an option is doing from context. Using the short version makes commands more concise and reduces unnecessary reading.
711 |
712 | `-filter:a` can be written as `-af` and creates an [audio filter](#audio-filter).
713 | `-filter:v` can be written as `-vf` and creates a [video filter](#video-filter).
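For example, the following two commands should be equivalent; the input file and the `hue=s=0` filter (which removes saturation, giving greyscale) are just illustrative:

```bash
# Create a short test input (two seconds of solid green)
ffmpeg -filter_complex 'color=green, trim=duration=2s' input.mp4
# Long form: "filter" and the stream type are spelled out
ffmpeg -i input.mp4 -filter:v 'hue=s=0' grey-long.mp4
# Short form: the same video filter written concisely
ffmpeg -i input.mp4 -vf 'hue=s=0' grey-short.mp4
```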
714 |
715 |
716 | ## Using flags or filters
717 | Some of the functionality of filters can be implemented instead as command-line flags.
718 | Using flags can pull complexity out of filters as a form of modularity. Converting flags into filters may result in a command that is easier to read for unfamiliar users and more suitable for searching. There may also be features available for flags which are unavailable for filters, and vice versa.
719 |
720 | `-ss` can be used to seek to a particular position so that some initial material is skipped. You can use the [`trim` filter (doc)](https://ffmpeg.org/ffmpeg-filters.html#trim) with the [start option (example)](#trim-start) instead of `-ss`.
721 |
722 | `-sseof` is analogous to `-ss` but seeks relative to the end of the file. Again, one can use the
723 | [`trim` filter (doc)](https://ffmpeg.org/ffmpeg-filters.html#trim) instead of this with its `end` parameter.
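As a sketch of the two styles, using a counting video like the one from the frame-extraction recipe (note that after `trim` we also use `setpts` to reset the timestamps, which `-ss` does automatically):

```bash
# Generate a ten-second counting video to work with
ffmpeg -filter_complex 'color=color=white, drawtext=text=%{pts}, trim=duration=10s' counting.mp4
# Skip the first four seconds with a flag
ffmpeg -ss 4 -i counting.mp4 skipped-flag.mp4
# Skip the first four seconds with a filter; setpts resets the timestamps
# so the output does not begin with four seconds of delay
ffmpeg -i counting.mp4 -vf 'trim=start=4s, setpts=PTS-STARTPTS' skipped-filter.mp4
```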
724 |
725 |
726 |
727 |
728 | # Getting information about FFmpeg
729 | FFmpeg has various functions to provide documentation and information about how to run FFmpeg by calling FFmpeg itself.
730 | ## Listing objects
731 | FFmpeg has various types of objects that form part of commands. These objects can be listed like so:
732 |
733 | ```bash
734 | ffmpeg -colors
735 | ffmpeg -filters
736 | ffmpeg -devices
738 | ffmpeg -codecs
739 | ffmpeg -decoders
740 | ffmpeg -encoders
741 | ffmpeg -protocols
743 | ffmpeg -pix_fmts
744 | ffmpeg -sample_fmts
745 | ffmpeg -sinks pulse
746 | ```
747 |
748 | See [`man ffmpeg`](#man-pages) for details.
749 |
750 | > **See also**: [Listing ffmpeg filters (example)](#list-filters), [FFmpeg documentation](#documentation)
751 |
752 |
753 |
754 | # Using commands
755 | [Commands (doc)](https://ffmpeg.org/ffmpeg-filters.html#sendcmd_002c-asendcmd) ([example](#command-example)) are a general mechanism to modify the behaviour of filters while they are running, such as by changing the value of [parameters (example)](#parameter-example).
756 |
757 | > **See also**: [Expressions](#expressions-topic)
758 |
759 | ## Check that parameter can be updated by a command
760 | The [help for a filter (example)](#parameters) indicates with the letter `T` whether a parameter can be modified by a command.
761 |
762 | ```
763 | ffmpeg --help filter=drawtext
764 | ```
765 |
766 | Here, in the output of `ffmpeg --help filter=drawtext`, we can see that its `text` can be updated by commands, but that `fontfile` and `textfile` (if used) cannot be.
767 |
768 | ```
769 | Filter drawtext
770 | Draw text on top of video frames using the `libfreetype` library.
771 | Inputs:
772 | #0: default (video)
773 | Outputs:
774 | #0: default (video)
775 | drawtext AVOptions:
776 | fontfile ..FV....... set font file
777 | text ..FV.....T. set text
778 | textfile ..FV....... set text file
779 | ```
780 |
781 | Many filters such as `sine` have parameters that cannot be updated by commands even though you might expect them to be mutable.
782 |
783 |
784 | ## Update commands from the command line with zmq
785 | The [zmq filter (doc)](https://ffmpeg.org/ffmpeg-filters.html#zmq_002c-azmq) allows one to read commands controlling filters from an external source. `zmq` is a popular protocol used for lightweight message buses, and there is command-line tooling for it.
786 |
787 | ```
788 | pipx install zeroless-tools
789 | ffplay -f lavfi -i 'color, zmq' &
790 | echo 'color color orange' | zeroclient 5555 req
791 | echo 'color color purple' | zeroclient 5555 req
792 | kill %
793 | ```
794 |
795 | This example installs command-line tooling for zmq using pipx. We then start playing a filter that outputs a solid colour and uses zmq in the filter chain, allowing this colour to be modified. We send a message to the color filter that its colour should be changed to orange, then purple. By default the zmq filter listens on port 5555.
796 |
797 | `zmq` supports different modes of message passing - we seem to need to use the request protocol, indicated by `req`, when we use zeroclient.
798 |
799 | ## Create a remote control for a video being played
800 |
801 | ```
802 | ffplay -f lavfi -i 'sine=frequency=440, volume=sin(t)*sin(t):eval=frame'
803 | ```
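Building on [the zmq recipe above](#zmq), here is a hedged sketch of a remote control: we play a tone through the `volume` filter, whose `volume` parameter can be updated by commands, with `azmq` in the chain, and change the volume from the command line. The exact filter chain is an assumption based on the earlier zmq example.

```bash
ffplay -f lavfi -i 'sine=frequency=440, volume=1.0, azmq' &
echo 'volume volume 0.2' | zeroclient 5555 req   # quieter
echo 'volume volume 1.0' | zeroclient 5555 req   # back to full volume
kill %
```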
804 |
805 |
806 | # Index of language features
807 | At its core, FFmpeg applies filters on input, feeds the result into other filters, and then produces output.
808 | There are more filters available than this cookbook could cover. You can [list filters](#list) from the command line.
809 |
810 | Filters take a number of inputs and outputs. [Source filters](#source) have outputs but no inputs. Filters take [parameters](#params) which are expressed in the form [`key=value`](#params) and [separated by colon (`:`) characters](#colons). Parameters have an order and you can [omit the parameter name](#omit-name) if parameters are given in this order. For some filters and some parameters you can use [expressions](#expressions-topic) to calculate a value based on things like the width and height of the frame and the timestamp.
811 |
812 | You can connect filters that take one input stream and produce one output stream [using `,`](#comma). Alternatively you can create a named stream and write to it by putting a label in [square brackets after an expression](#output-filter).
813 |
814 | You can then run separate filters in parallel by [separating them with a `;`](#parallel-stream). When doing so, it is often necessary to label streams, both for [output](#output-stream) and [input](#input-stream). Some filters such as [concat](#concat-stream) take multiple inputs and can produce multiple outputs.
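As a small sketch tying these pieces together (the particular sources and filters are illustrative):

```bash
# Two labelled source filters run in parallel (separated by ';'),
# are combined by hstack, and the result is piped with ',' into drawtext
ffplay -f lavfi -i 'color=red:size=100x100 [left]; color=blue:size=100x100 [right]; [left][right]hstack, drawtext=text=hello'
```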
815 |
816 | # Filters
817 |
818 | There are more filters than we can completely document here, and there is already reference documentation. You can see documentation for filters with `man ffmpeg-filters` or [via the ffmpeg website](https://ffmpeg.org/ffmpeg-filters.html), which also describes the [syntax for expressing parameters to filters](https://ffmpeg.org/ffmpeg-filters.html).
819 |
820 | You can [show help for a filter](#filter-help) and [list all filters](#list-filters).
821 |
822 |
823 | # Expressions
824 | Some filters support [programmatic expressions (doc)](https://ffmpeg.org/ffmpeg-utils.html#toc-Expression-Evaluation) in some of their parameters. Filters tend to decide for themselves how they handle expressions - for example [drawtext has a different expression language (example)](#drawtext) with a different syntax - but most of the time if a filter's parameter uses expressions it uses this shared language, though with its own set of variables ([ex1](#variables-1), [ex2](#variables-2)).
825 |
826 | You can see the documentation for the shared parts of the expression language (such as functions) with [`man ffmpeg-utils`](#man-pages).
827 |
828 | Expressions are useful for "continuous" behaviour, but you can add discrete behaviour using the `if` function. For discrete behaviour, [commands](#commands-examples) can be more useful. Just as a filter decides for itself whether an expression can be used for a parameter, it decides which of its parameters can be updated with commands.
829 |
830 | Examples of parameters supporting expressions include [drawtext's x](#drawtext) and [aevalsrc's expr](#aevalsrc)
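For example, the `if` function can switch an expression's behaviour at a point in time. Here is a sketch using drawtext's `x` parameter (note that the commas in the expression are escaped with `\,`):

```bash
# The text sits at the left edge for five seconds, then jumps to the right edge
ffplay -f lavfi -i 'color=white, drawtext=text=hello:y=(h-text_h)/2:x=if(lt(t\,5)\,0\,w-text_w)'
```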
831 |
832 |
833 | It may be useful to understand some [programming concepts](#programming), such as *escaping*, *variables* and *functions*, to use expressions.
834 |
835 |
836 |
837 | # Documentation
838 |
839 | ## Manual pages
840 | Manual pages, accessed through the command `man` from the [command-line](#programming), are a commonly used way to provide easy-to-access documentation from the command-line.
841 |
842 | FFmpeg provides various manual pages. You can list these with the command `apropos ffmpeg`. There is online documentation corresponding to each man page, which we link to here:
843 |
844 | * [ffmpeg (doc)](https://ffmpeg.org/ffmpeg.html)
845 | * [ffmpeg-all (doc)](https://ffmpeg.org/ffmpeg-all.html)
846 | * [ffmpeg-bitstream-filters (doc)](https://ffmpeg.org/ffmpeg-bitstream-filters.html)
847 | * [ffmpeg-codecs (doc)](https://ffmpeg.org/ffmpeg-codecs.html)
848 | * [ffmpeg-devices (doc)](https://ffmpeg.org/ffmpeg-devices.html)
849 | * [ffmpeg-filters (doc)](https://ffmpeg.org/ffmpeg-filters.html)
850 | * [ffmpeg-formats (doc)](https://ffmpeg.org/ffmpeg-formats.html)
851 | * [ffmpeg-protocols (doc)](https://ffmpeg.org/ffmpeg-protocols.html)
852 | * [ffmpeg-resampler (doc)](https://ffmpeg.org/ffmpeg-resampler.html)
853 | * [ffmpeg-scaler (doc)](https://ffmpeg.org/ffmpeg-scaler.html)
854 | * [ffmpeg-utils (doc)](https://ffmpeg.org/ffmpeg-utils.html)
855 |
856 | ## Web pages
857 | FFmpeg provides [reference documentation online](https://ffmpeg.org/ffmpeg.html).
858 |
859 | > See [alternatives](#alternatives) for documentation similar to this.
860 |
861 |
862 |
863 | # General programming and command-line knowledge
864 | FFmpeg is a command-line tool written for the kind of user that uses command-line tools. This comes with certain affordances and norms which FFmpeg assumes and makes use of. Completely covering concepts related to general computer use would distract from this guide. However, I can provide links to certain concepts that someone more interested in FFmpeg from an artistic or practical angle might not know.
865 |
866 | Once you have derived some value from this guide, you might like to review all these concepts to speed up your understanding of the rest of the material.
867 |
868 | * [Escaping](https://en.wikipedia.org/wiki/Escape_character). Used in [audio filters](#audio-filters) and as part of the [expression language (example)](#expression-example) [(topic)](#expressions-topic).
869 | * FFmpeg's [expression language (example)](#expression-example) makes use of [variables](https://en.wikipedia.org/wiki/Variable_\(computer_science\)) and [functions](https://en.wikipedia.org/wiki/Function_\(computer_programming\))
870 | * FFmpeg provides a [command-line interface](https://en.wikipedia.org/wiki/Command-line_interface) which is often run from a [shell](https://en.wikipedia.org/wiki/Unix_shell), commonly [bash](https://en.wikipedia.org/wiki/Bash_(Unix_shell)).
871 | * [Linux](https://en.wikipedia.org/wiki/Linux) is an operating system which is often easier to program for but may lack commercial support
872 | * [man](https://en.wikipedia.org/wiki/Man_page) is a command line tool available on many operating systems like Linux and Mac that can be used to quickly obtain documentation from the command-line
873 | * [Zero-based numbering](https://en.wikipedia.org/wiki/Zero-based_numbering). Some filters such as [xstack (example)](#xstack) start counting from 0 for some values.
874 | * [printf](https://en.wikipedia.org/wiki/Printf) is a way of formatting data into text used in C as well as other languages.
875 |
876 |
877 | # General audio visual concepts
878 | This guide assumes understanding of certain concepts related to images and videos. We will not cover this material in depth, but this section provides a glossary.
879 |
880 | * [Aspect ratios (wiki)](https://en.wikipedia.org/wiki/Aspect_ratio_(image)) describes the ratio of the width of an image or video to its height. When [scaling (example)](#scale) you may wish to maintain the aspect ratio.
881 | * [Pixels](https://en.wikipedia.org/wiki/Pixel): an image consists of a grid of pixels, each of which is a certain colour.
882 | * [Coordinates](https://en.wikipedia.org/wiki/Coordinate_system): pixels in an image can be labelled with their horizontal distance from the left edge and their vertical distance from the top of the image.
883 |
884 | # Support
885 | If you found this guide helpful you can incentivize me to work on it and similar things by [donating to my ko-fi](https://ko-fi.com/c/91cb0010ae).
886 |
887 | You could also [book an online tutorial on this material](https://ko-fi.com/c/6ee072d36c).
888 |
889 | Alternatively, simply reading [other documentation I have written](https://readwithai.substack.com/p/my-command-line-documentation) supports my (varied) work.
890 |
891 | # About me
892 | I am **@readwithai**. I make tools for research, reading and productivity - sometimes with [Obsidian](https://readwithai.substack.com/p/what-exactly-is-obsidian).
893 |
894 | I am interested in `FFmpeg` because I am [a fan of the command-line](https://readwithai.substack.com/p/my-productivity-tools) and occasionally do image editing and video-editing.
895 |
896 | If you liked this guide you might like to:
897 |
898 | 1. Have a look at some of [my productivity and command line tools](https://readwithai.substack.com/p/my-productivity-tools);
899 | 2. Read [my review of note taking in Obsidian](https://readwithai.substack.com/p/note-taking-with-obsidian-much-of); or
900 | 3. Read my [cookbook for the Obsidian note taking tool](https://open.substack.com/pub/readwithai/p/obsidian-plugin-repl-cookbook) and my open source Emacs inspired automation plugin [Plugin REPL](https://github.com/talwrii/plugin-repl).
901 |
902 | You can follow me on [X](https://x.com/readwithai) where I write about productivity tools like this besides other things, or my [blog](https://readwithai.substack.com/) where I write more about reading, research, note taking and Obsidian.
903 |
--------------------------------------------------------------------------------