The rectangle behind you

The making of Four Laps

Marcin Wichary
May 29 · 25 min read

On May 27, 2021, during an internal lightning talk session at Figma, I gave a remote “Ignite” 5-minute talk.

(This is a live recording; there was no video editing after the talk.)

I love Ignite talks and occasionally experiment with them, but this was the first one I’ve done remotely, and I’ve never spent so much time and effort putting together the tech for one (mostly using OBS, which is an exciting, experimental piece of software used by streamers and remote presenters), and practicing.

In this post, I’ll be sharing a lot of details of how I did this. I don’t assume you will want to create another video like this (although why not?), but you might find bits and pieces of what I learned interesting.

This article has three sections:

  1. The creative process and the setup
  2. Specific tech questions
  3. What I learned about controlling and scripting OBS (incl. all of my code)

Hope you enjoy!

1. The creative process and the setup

Why did I do this? It seemed like a weird gimmick and a fun challenge. Like with many of my weird creative projects, for most of the time I assumed that this would fail — after all, I hadn’t heard of anyone doing something like this, and I had no idea if the tech was even available — but at least I would learn something.

Very quickly, I came up with basic questions:

  • Can one do compositing in OBS? (can I overlap myself?)
  • Can one feed the recording into itself as playback?
  • How capable is OBS scripting? Can I do everything programmatically?
  • Do I know how to make a consistently white background?
  • Can I practice enough for this to feel good?
  • Do I actually have something to say here?

I started answering the big ones first, and soon each one of them was replaced by a fractal of small ones. You’ll see most of them below.

The story

I knew I wanted to include Come into my world, Opportunities, and Tango. I quickly created a spreadsheet to try to map it out:

I originally planned to have five laps — one for each section — but this would leave less than a minute for each one. This felt tight time-wise, and also I didn’t know if I could make myself small enough using my camera/lens.

So, I ended up with four laps, 70 seconds each, with a little room at the end to breathe. This, coincidentally, was exactly the structure of Come into my world, except that video’s laps were 61 seconds each.

I thought it’d be fun to mess with time in another way, by moving backwards — from 2002, to 1986, to 1980.

Quickly after establishing the early structure, I wrote down my script and started reading it out loud to gauge the time needed:

As always with Ignite talks, I realized there was much less time than I expected: I had to drop some details about Minogue, a whole passage about rotoscoping her hair, and a later reference to a Stanisław Lem short story about multiple Ijon Tichys.

In terms of the story, I struggled with the ending, eventually arriving at a (cute?) hypothetical describing something I was already doing… not sure if it ever quite landed.

I started practicing in front of the camera as soon as I could, seeing how much time I needed for certain things (e.g. putting the jacket on), and how the gestures felt.

It was interesting to me how the structure pushed back on some of the ideas:

  • I didn’t originally plan to fast forward, but that would make the first lap much too long. I hated this for the longest time, but now I think it works nicely.
  • Originally, I wanted to put the hat on at the end at the exact moment I originally took it off — creating the impression this was the same hat — but the structure actually made that impossible without introducing a visible fifth lap!

Lastly, here are some of my ideas that didn’t happen:

  • programmatically changing colours of the shirt or making me lighter with each loop
  • cough at the same time (too hard to sync up)
  • duck/throw something (dangerous)
  • yawn (weird pause)
  • subvert the glasses thing at the end by wearing weird glasses (didn’t fit with the story)
  • instead of spelling F·O·U·R, do something more sophisticated (didn’t have a better idea)
  • some more advanced choreo (ran out of time)
  • turn on the light above me or pull up a banner or change the environment (ran out of time)

The setup

These are photos of the setup from both sides:

This is the diagram of the entire setup:

  • Mac computer: This computer grabbed the video from my camera; in OBS, the recording of that camera was fed back in as a video input after a 70-second delay, via a set of Lua scripts I wrote. The video camera source was always on top, so it required a color key filter so that the white became “transparent” and the delayed videos in the background could be seen.
  • Windows computer: The video output from the Mac computer was fed to a Windows computer’s OBS and composited with a pre-recorded 5-minute After Effects video that contained titles, “slides,” and all the music via another set of Lua scripts I wrote. The Mac computer output was always on top, and required a similar color key as above to allow the slides to be “behind me.” This computer also grabbed the sound from the mic, recorded everything, and sent things to Zoom.
  • iPad: The iPad was responsible for showing the “timer” that alerted me to various events that were supposed to happen — like live quick-time events. Most of the events were announced in real time, but crucial ones, like moving to a new position (where I absolutely needed to hit the right moment), were counted down.
  • There was also a fourth computer, but it was mostly used to connect to Zoom independently and just wait for my cue during the lightning talk session.

I’m surprised this actually worked

I almost quit many times:

  • when the first attempts to script OBS resulted mostly in crashes, and when I realized how poorly documented everything is
  • when I realized I would have to have two computers
  • when my other Mac was too slow to handle OBS
  • when the Windows computer that replaced it worked well for a while, and then stopped performing well

This was a project that reminded me of the very early days of personal computers, where the technology was so young, the documentation so scarce, and the reliability so low, that you really needed to want it. I guess I really wanted this? But also, at some point, I wanted to keep going because I thought writing down what I learned would be worth it.

I learned a lot of little things from this: OBS scripting, Lua, After Effects, how to create a white background, how to make a blooper video (keep reading!), how to mount an iPad and use it as a helper, and even how to (very poorly) apply make-up.

The following two sections have more details about the tech, all the way down to source code:

2. Specific tech questions

How was the video looped?

In OBS, at least on a Mac, you can add the video of the current recording (if you figure out the filename) to the scene as a regular video. You can’t do it immediately, but since my loops were 70 seconds, that left more than enough time.
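In script form, the idea looks roughly like this. This is a sketch, not my exact code: the recording path and the current_scene variable are assumptions, and local_file/looping are the media source settings used elsewhere in this post.

```lua
local obs = obslua

-- Hypothetical path of the recording currently being written to disk
local recording_path = "/Users/mwichary/Movies/obs-recording.mkv"

function add_delayed_playback()
  obs.timer_remove(add_delayed_playback) -- fire only once

  local settings = obs.obs_data_create()
  obs.obs_data_set_string(settings, "local_file", recording_path)
  obs.obs_data_set_bool(settings, "looping", false)
  local playback = obs.obs_source_create("ffmpeg_source", "Delayed playback", settings, nil)

  -- current_scene is assumed to hold the scene everything lives in;
  -- the playback goes behind the (color-keyed) live camera
  local item = obs.obs_scene_add(current_scene, playback)
  obs.obs_sceneitem_set_order(item, obs.OBS_ORDER_MOVE_BOTTOM)

  obs.obs_data_release(settings)
  obs.obs_source_release(playback)
end

-- One lap later, start playing the in-progress recording behind me
obs.timer_add(add_delayed_playback, 70 * 1000)
```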

This is a snippet from a pretty creepy-looking early timing practice, with a lap length of 8 seconds (you will notice that it’s in reverse order, with the youngest me behind the older ones):

How did you make the slides/titles/etc.?

I used After Effects, and it was pretty straightforward work.

I later used VLC to transcode the After Effects output to a .ts file that had a more manageable size. (ffmpeg also would’ve been great.)
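If you’d rather do it from the command line, an ffmpeg invocation along these lines should be roughly equivalent (the filenames and codec choices here are placeholders, not my exact settings):

```
ffmpeg -i slides.mov -c:v libx264 -c:a aac slides.ts
```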

How did you do the little rotoscoping bit?

There was a (short) video of the making of Come into my world from which I lifted the camera crane snippet, but I didn’t see any videos showing actual rotoscoping. So I used the After Effects function called Roto Brush, which was nice — and kind of magical — although really slow.

How did you know where to stand?

I put sticky notes on the floor and the table I used. Since the table wasn’t wide enough, I taped the sticky to a chopstick:

What kind of white screen did you have?

I don’t remember. I bought it a while back. It’s cloth.

I generally never quite figured out the best way to set this up. I put a random assortment of diffused lights in the back, and two diffused lights in front, but it never quite coalesced, and it was painful to keep white — particularly at the edges and in the corners, where there was always some falloff (my space is very small).

I used a LUT to help with the background, and increased brightness/contrast as an OBS filter just enough. It was also important to switch my camera’s metering mode to “entire screen average.”
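The brightness/contrast part can also be added from a script. Here’s a hedged sketch, in the same style as the color key filter shown in section 3: the filter id (color_filter, OBS’s Color Correction filter) and the setting names come from OBS’s source code, the values are illustrative, and camera is assumed to be the camera source.

```lua
-- Add a Color Correction filter to the camera, nudging brightness and
-- contrast up just enough (values here are illustrative)
local brightness_settings = obs.obs_data_create()
obs.obs_data_set_double(brightness_settings, "brightness", 0.1)
obs.obs_data_set_double(brightness_settings, "contrast", 0.15)
local brightness_filter = obs.obs_source_create_private("color_filter", "Brightness boost", brightness_settings)
obs.obs_source_filter_add(camera, brightness_filter)
obs.obs_data_release(brightness_settings)
obs.obs_source_release(brightness_filter)
```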

This is an example of me testing the light:

Why white screen and not green screen?

I think green screen is probably the way to go — see fringing in my video that I could never fully get rid of — but I already had a white background. I was also worried green screen would require even more lights. Something to try for next time!

How did you make the timer?

I mounted an iPad on a stand just below the camera, and wrote myself a little HTML/JavaScript app. It’s not very complicated.

You can see the timer here (click or press Enter to activate) or check out the source code.

Notes:

  • most of the UI is at the top of the screen. It was originally centered vertically, but I noticed that this resulted in my eyes moving on the screen much more visibly
  • there are special annotations when the music plays loudly. I actually didn’t hear any music as I was speaking, because I was worried it would create feedback or be recorded twice. So I needed a visual indication that music was playing so that I wouldn’t speak over it.

If you are interested, here’s an entire talk with the timer overlaid so you could see what I saw. (This is a different take than above, because… why not.)

How did you mount the iPad below the camera?

I got this stand.

How did you mount the Windows computer close to the camera, so it could serve as a monitor?

I put it on my trashcan, and I put the trashcan on my piano seat. D’oh.

How much did you practice?

I recorded all of my practice videos, and so I know it was exactly 56 practice runs before the final one: ten reading the script, and 46 actually saying things from memory.

You can see here how it all changed over time:

What kind of mic did you use?

Since I was moving a lot, I thought the only reasonable way would be to use a lavalier mic. I also thought a cabled one would be better so I don’t rely on wireless. This is the one I used.

I think the lav in the videos introduces a bit too much noise and also it’s coming in a bit hot. Something to improve for the next time!

How to connect the lav mic to a Windows laptop?

I bought a little cheap device that connected via USB and exposed a mic input.

How did you prepare for failure?

This was an incredibly complex system with a lot of moving parts I didn’t fully understand or control. So I did a lot of work to prepare myself for contingencies:

  • a few days before, I made a “release candidate” video recording to have in the back pocket for the audience
  • over the next few days, I kept replacing that video as I kept improving
  • I had a pretty extensive checklist for the day of (this is only about 25% of it) and also a last-minute checklist on the wall:
  • for the Mac, the checklist started from actually rebooting (sometimes connecting HDMI would make it go into an endless loop)
  • I made both OBS instances start in a deterministic state (by default, OBS starts the way you left it last)
  • on my Mac, I had this wallpaper set up just in case OBS decided to randomly crash in the middle (which it did once in a while):

What kind of camera did you use and how did you connect it?

I have a Sony Alpha a7R III. I used a 20mm (relatively wide angle) prime lens.

I had the ISO and focus set to manual. Also, I had the metering mode switched to “entire screen average”— that helped a lot.

I used a mini-HDMI/HDMI cable and connected it via Cam Link 4K to my Mac. That exposed it as a virtual camera that I could just use in Zoom (but not, for some reason, FaceTime). This is normally how I dial into work meetings.

Lastly, I used a fake battery connected to the mains.

How did you maximize room?

I don’t have a huge apartment, so I tried a few tricks:

  • wide angle lens
  • tilted the camera a bit up
  • I didn’t know how to raise the camera so that I wouldn’t cover so much of the screen… so instead I moved the recording down by 100 pixels. (I just needed to be careful not to raise my hands too much, or they’d appear clipped.)

How did you pipe the video from Mac into the Windows computer?

I changed my MacBook’s resolution to 1920×1080 (non retina). I made OBS go full screen, I disabled True Tone and Night Shift, and I used (another, borrowed) Cam Link 4K with an HDMI cable.

(My friend recommended a cheaper capture device, but I never got it to work reliably in 1080p, only in 720p.)

Why did you use a Windows computer as a second computer?

I had a second MacBook, and an older Surface Book (first generation). I originally used the former, but it couldn’t handle even the compositing required, let alone also outputting to Zoom.

What I learned was that OBS on Mac for some reason isn’t hardware accelerated, but it is on Windows. So I dusted off the Surface Book, changed my code a bit, and ran it there.

It handled everything swimmingly! Well…

How did you make Windows computer perform better?

After many days of working well, for some reason at some point the OBS on Windows started behaving poorly — skipping frames, choppy mouse cursor, that kind of stuff.

Nothing seemingly changed in the setup, and internet research proved inconclusive. I was despairing enough to consider buying a newer Windows laptop just for this… until I realized that just reducing the screen resolution to something much smaller (1920×1080, non retina) made it behave okay again.

The Windows computer used the virtual camera for Zoom, and recorded to file, and they were both using 1920×1080 regardless of the screen resolution. And I only looked at the screen as a monitor a few times during the presentation (the timer was the main source of info), so the actual resolution didn’t matter.

How did you share the video to Zoom?

Normally, Zoom combines two separate video sources:

  • your webcam (prioritizes framerate over quality)
  • your screen share (prioritizes quality over framerate, although there is an option to make it closer to the webcam if you’re sharing video). Also, no pixels are cropped

If you are sharing with OBS, you can combine your screen and your webcam and potentially many other sources into one frame before Zoom even gets to it. And you can also send it back to Zoom in two different ways:

  • as a webcam (making OBS create a fake webcam in your system that basically has your OBS screen)
  • as a “projector” or just screen sharing an OBS window

It’s the former that I chose for this project, as I cared more about framerate than quality.

How did you share the audio to Zoom?

Unfortunately, the virtual camera doesn’t have an audio component.

I had to install a piece of software called Voice Meeter, which allowed me to redirect my microphone input from the USB dongle to a virtual output called B1. And this virtual output was also set as the output in Windows, ensuring the music from the pre-recorded slides was also mixed in.

(You might wonder why I didn’t just allow the microphone to be output to speakers in OBS. However, that also made the microphone be recorded twice in the recording, with a small delay, creating an awkward echo.)

Here are my settings:

Please note that the computer itself was muted, so I didn’t hear anything as I was talking.

How did you solve the issue of Voice Meeter audio getting garbled?

Voice Meeter’s audio engine sometimes makes the audio garbled and robotic (choppy). This seems to just happen once in a while.

You can fix this by opening the A1 menu and resetting the audio engine there.

To know whether my audio was garbled or not, I temporarily unmuted my computer, enabled the A1 output in the first column (HARDWARE INPUT 1), and then plugged in headphones to avoid feedback. If I could hear my microphone correctly, it worked. If not, I reset the audio engine as many times as necessary.

How did you solve the sync issue between video and audio?

Since I was recording the video on one computer, and audio on the other, there was a delay between video and audio as recorded and as sent to Zoom.

I actually didn’t solve it for Zoom — I never got to it, figuring it wouldn’t be very noticeable, since it was Zoom and I wasn’t that big on the screen.

For the recording, I adjusted the delay using ffmpeg, shifting the audio track by 0.3 seconds:

ffmpeg -ss 00:00:28.00 -i input.mkv -itsoffset 0.3 -ss 00:00:28.00 -i input.mkv -map 0:v -map 1:a -c copy -t 05:00:00 output.mkv

How did you make Zoom go 1080p?

I don’t think I did. Don’t ask me more Zoom questions; I really hate Zoom.

How did you make sure the three devices started at the same time?

I connected a regular Logitech presentation remote to the Mac computer. In OBS on that computer, I assigned the script to the right arrow key.

Then, my Lua script would ping my server upon receiving that event, telling it to set its timestamp to the current date.

The Windows computer and the iPad with the timer were both asking the server every 500ms for the saved timestamp. If they noticed the timestamp being newer than the last one, they started their own parts.

This wasn’t very precise, of course, but it was precise enough for me.

(Also, this was an external server, which was a bad idea — it would’ve been better for it to be internal, but then again I needed internet to give that talk, so…)

(Also, I used regular GET/PUT, rather than something smarter like websockets. Seems to have worked well.)
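The Mac side of this can be sketched in OBS Lua like so. The hotkey registration API is real OBS scripting; the hotkey name, description, and URL are made up for illustration, and the curl invocation is the same fire-and-forget ping shown in section 3.

```lua
local obs = obslua

-- Hypothetical endpoint; hitting it makes the server save "now"
-- as the shared start timestamp
local ping_url = "https://example.com/start"
local hotkey_id

function on_start(pressed)
  if not pressed then return end
  -- Fire-and-forget ping; the trailing & keeps curl from blocking OBS
  os.execute("curl --connect-timeout 1 --max-time 1 '" .. ping_url .. "' 2>&1 &")
end

function script_load(settings)
  -- Registers a "Start the talk" hotkey, which can then be bound to the
  -- presentation remote's right-arrow key in OBS's hotkey settings
  hotkey_id = obs.obs_hotkey_register_frontend("start_talk", "Start the talk", on_start)
end
```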

What font is that?

Bild Compressed from David Jonathan Ross.

How did you stop your cat from interfering with your practice or messing up your sound by meowing?

Trick question. You can’t control cats.

3. What I learned about controlling and scripting OBS

I found OBS to be pretty poorly documented – there is a bit of documentation, but few solid examples… and there is a community of people who share scripts, but the coverage seems low.

I put up my source code on GitHub here (with some comments):

And below I’ll tell you every tricky part about OBS that I learned in the process.

Note: All of the below require this line, which seems to be a standard way to shorten obslua in all the scripts I’ve seen online:

local obs = obslua

Why OBS and not Streamlabs OBS?

I couldn’t get the virtual camera to work on my Mac with Streamlabs OBS, but could in OBS. (Eventually, I ended up not needing it, but was already used to OBS by then.)

How to log to OBS’s console (called Script Log)?

I found this somewhere:

local function log(name, msg)
  if msg ~= nil then
    msg = " > " .. tostring(msg)
  else
    msg = ""
  end
  obs.script_log(obs.LOG_DEBUG, name .. msg)
end

How to effectively iterate in OBS?

OBS crashed a lot, particularly if I wanted to run the same script again. It’s possible that I was doing something wrong — perhaps not releasing the resources properly, etc.

But it’s also a bit clunky to reload scripts, since there is no keyboard shortcut, and the button lives in a window rather than being exposed.

So I learned that the best way to iterate is to quit OBS and quickly open it again. Quitting even while a file was being recorded seemed to finish the recording properly, and restarting the app forced my script to reload, too.

One challenge was that OBS remembers its state (all the scenes and items) between openings — so I had to make OBS forget the latest changes and always wake up in a predefined state:

How to start OBS in a certain state (Mac)?

When I was happy with my OBS settings, I duplicated the entire directory ~/Library/Application Support/obs-studio and renamed the result obs-studio-kylie.

I wrote this shell script:

rm -rf ~/Library/Application\ Support/obs-studio
cp -R ~/Library/Application\ Support/obs-studio-kylie ~/Library/Application\ Support/obs-studio
open /Applications/OBS.app

I named the file startObs.sh.command — the .command extension allowed me to drag it into the dock.

Lastly, in the Terminal app, I went to Preferences, then Shell, and set When the shell exits: to Close if the shell exited cleanly. This made sure that my script didn’t leave behind a terminal window. (Which would be visible to the audience if OBS crashed.)

How to start OBS in a certain state (Windows)?

I created a .bat file. Instead of replacing the whole setup, it only deletes all the scenes and replaces them with the new one. In retrospect, I probably should’ve deleted the entire directory instead, like above, for simplicity:

del /q c:\users\mwich\AppData\Roaming\obs-studio\basic\scenes\*.*
copy c:\users\mwich\desktop\kylie\bat\z.json c:\users\mwich\AppData\Roaming\obs-studio\basic\scenes
cd "C:\Program Files\obs-studio\bin\64bit"
obs64.exe

How to make OBS go full screen?

I wanted a pure 1920×1080 output from OBS so I could redirect it to the Windows computer.

Normally, one would use a virtual camera or a recording, but this needed to go live over HDMI. I think a long time ago I used something called a projector, but:

  • the last time I used them, they still had bits and pieces of UI
  • I only had one display, and I felt this would’ve made things complicated
  • I remembered it taking extra CPU
  • I actually, believe it or not, couldn’t find it in the UI

The solution I found was basically configuring OBS to remove all the UI:

  • I toggled all of these options off, and used Lock UI so that they wouldn’t reset
  • In the Edit menu (why there???), I opened Preview Scaling, and changed it to be 1:1 (otherwise even in fullscreen, there was a bit of a bezel)
  • I made OBS go fullscreen via Mac’s regular “green button”

How to move an item in the scene?

This is how to move it down by 100px:

local item = obs.obs_scene_add(current_source, camera)
local pos = obs.vec2()
pos.x = 0
pos.y = 100
obs.obs_sceneitem_set_pos(item, pos)

How to resize an item?

I ended up not needing that, since all of my items matched the 1920×1080 resolution of the whole scene, but just in case:

local item = obs.obs_scene_add(current_source, camera)
local scale = obs.vec2()
scale.x = 0.9375
scale.y = 0.9375
obs.obs_sceneitem_set_scale(item, scale)

How to move an item in the z-index?

local item = obs.obs_scene_add(current_source, camera)
obs.obs_sceneitem_set_order(item, obs.OBS_ORDER_MOVE_BOTTOM)

How to remove all items in a scene?

local sceneitems = obs.obs_scene_enum_items(current_source)
for i, sceneitem in ipairs(sceneitems) do
  local sourceSrc = obs.obs_sceneitem_get_source(sceneitem)
  local type = obs.obs_source_get_id(sourceSrc)
  local settings = obs.obs_source_get_settings(sourceSrc)
  obs.obs_sceneitem_remove(sceneitem)
end

How to prevent a red border from appearing around things?

Newly-created items are selected and therefore have a red border around them. This doesn’t matter for a virtual camera (used by Zoom) or for a recording, but it matters when you send the entire OBS window to another computer, as per above.

I could not find a way to deselect items programmatically in OBS, but locking them does the trick:

local color_item = obs.obs_scene_add(current_source, color_source)
obs.obs_sceneitem_set_locked(color_item, true)

How to set up a color key filter programmatically?

local recording_filter_settings = obs.obs_data_create()
obs.obs_data_set_int(recording_filter_settings, "key_color", 0xffffffff)
obs.obs_data_set_string(recording_filter_settings, "key_color_type", "custom")
obs.obs_data_set_int(recording_filter_settings, "similarity", 57)
obs.obs_data_set_int(recording_filter_settings, "smoothness", 94)
local recording_filter = obs.obs_source_create_private("color_key_filter", "Recording Filter", recording_filter_settings)
obs.obs_source_filter_add(camera, recording_filter)
obs.obs_data_release(recording_filter_settings)
obs.obs_source_release(recording_filter)

How to fade out?

First, create a completely white scene:

fadeout_scene = obs.obs_scene_create('Fadeout')
local new_something = obs.obs_scene_get_source(fadeout_scene)
local new_source = obs.obs_scene_from_source(new_something)
local color_settings = obs.obs_data_create()
obs.obs_data_set_int(color_settings, "width", 1920)
obs.obs_data_set_int(color_settings, "height", 1080)
obs.obs_data_set_int(color_settings, "color", 0xffffffff) -- White
local color_source = obs.obs_source_create("color_source", "Color", color_settings, nil)
local color_item = obs.obs_scene_add(new_source, color_source)
obs.obs_sceneitem_set_locked(color_item, true)
obs.obs_data_release(color_settings)

Then, set a fadeout timer:

obs.timer_add(start_fadeout, (5 * 60 + 25) * 1000) -- 5m25s

And then, transition:

function start_fadeout()
  obs.timer_remove(start_fadeout)
  local new_something = obs.obs_scene_get_source(fadeout_scene)
  local transitions = obs.obs_frontend_get_transitions()
  local fade = find_source_by_name_in_list(transitions, "Fade")
  obs.obs_frontend_set_current_transition(fade)
  obs.obs_transition_start(fade, obs.OBS_TRANSITION_MODE_AUTO, 2000, new_something)
end

How to only fire a timer once?

You initialize a timer this way:

obs.timer_add(something, 2000)

This will call the something function every 2 seconds. To do it only once, do this:

function something()
  obs.timer_remove(something)
  ...
end

Yeah, I know, it’s weird.

How to add CamLink capture programmatically? (Mac)

local camera_settings = obs.obs_data_create()
obs.obs_data_set_string(camera_settings, "device", "0x1100000fd90066") -- Cam Link
obs.obs_data_set_string(camera_settings, "preset", "AVCaptureSessionPresetHigh") -- This means 1920x1080
local camera = obs.obs_source_create("av_capture_input", "Camera video", camera_settings, nil)
local item = obs.obs_scene_add(current_source, camera)
obs.obs_data_release(camera_settings)

How to add CamLink capture programmatically? (Windows)

Very similar, but some parameters are named differently:

local camera_settings = obs.obs_data_create()
obs.obs_data_set_string(camera_settings, "video_device_id", "Cam Link 4K:\\\\?\\usb#vid_0fd9&pid_0066&mi_00#7&71832f1&0&0000#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\\global")
local camera = obs.obs_source_create("dshow_input", "Camera video", camera_settings, nil)
local item = obs.obs_scene_add(current_source, camera)
obs.obs_data_release(camera_settings)

How to get the necessary CamLink id? (Mac)

After opening System Information and going to Camera, the Unique ID is listed there. Then, you can plug it in:

obs.obs_data_set_string(camera_settings, "device", "0x12000000fd90066")

Note that the first digits of the id will change if you connect it to a different USB port… so don’t do it.

How to get the necessary CamLink id? (Windows)

If you go to Device Manager, then Cameras, then Details, and then just sort of click around, you will find all the necessary parts to construct the id:

Here’s an example:

obs.obs_data_set_string(camera_settings, "video_device_id", "Cam Link 4K:\\\\?\\usb#vid_0fd9&pid_0066&mi_00#7&71832f1&0&0000#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\\global")

How to know the various names of settings for various items you can put in the scene?

This seems (or is) poorly documented. The best luck I’ve had was looking at the source code.

For example, wanting to know how to programmatically create a color key filter, I looked at settings here in GitHub to know the right string ids, and then looked further down in the file to see whether they’re numbers, strings, bools, etc.

To find the filter id I needed, I searched for .id and found it here.

This led me to knowing that, in this example, I needed to do this:

local recording_filter_settings = obs.obs_data_create()
obs.obs_data_set_int(recording_filter_settings, "key_color", 0xffffffff)
obs.obs_data_set_string(recording_filter_settings, "key_color_type", "custom")
obs.obs_data_set_int(recording_filter_settings, "similarity", 57)
obs.obs_data_set_int(recording_filter_settings, "smoothness", 94)
local recording_filter = obs.obs_source_create_private("color_key_filter", "Recording Filter", recording_filter_settings)
obs.obs_source_filter_add(camera, recording_filter)
obs.obs_data_release(recording_filter_settings)
obs.obs_source_release(recording_filter)

(Random note: it seems that color_key_filter has a v2 version! Either this is new, or I didn’t notice it before.)

How to figure out the right setting values if those are not obvious?

What I found useful was creating the right item by hand, and then writing a script to inspect and output its settings:

local sceneitems = obs.obs_scene_enum_items(current_source)
for i, sceneitem in ipairs(sceneitems) do
  local sourceSrc = obs.obs_sceneitem_get_source(sceneitem)
  local type = obs.obs_source_get_id(sourceSrc)
  local settings = obs.obs_source_get_settings(sourceSrc)
  log('local_file', obs.obs_data_get_string(settings, "local_file"))
  log('looping', obs.obs_data_get_bool(settings, "looping"))
end

That was useful, for example, to know the expected values of these two settings:

obs.obs_data_set_string(camera_settings, "device", "0x1100000fd90066")
obs.obs_data_set_string(camera_settings, "preset", "AVCaptureSessionPresetHigh")

How to start and stop recording?

Easy:

obs.obs_frontend_recording_start()

and:

obs.obs_frontend_recording_stop()

How to start a virtual camera? (Windows)

Contrary to the above, this is not available directly. But I found, somewhere, a pretty weird recipe that reaches back into C.

Somewhere at the top, you have to do this. This, I believe, creates a bridge to C and declares the C signatures necessary to supplement Lua:

local ffi = require 'ffi'
local obsffi
if ffi.os == "OSX" then
  obsffi = ffi.load("obs.0.dylib")
else
  obsffi = ffi.load("obs")
end
ffi.cdef[[
typedef struct obs_hotkey obs_hotkey_t;
typedef size_t obs_hotkey_id;
const char *obs_hotkey_get_name(const obs_hotkey_t *key);
typedef bool (*obs_hotkey_enum_func)(void *data, obs_hotkey_id id, obs_hotkey_t *key);
void obs_enum_hotkeys(obs_hotkey_enum_func func, void *data);
]]

Then, this function does the rest:

function start_virtual_cam()
  local target = 'OBSBasic.StartVirtualCam'
  local htk_id
  function callback_htk(data, id, key)
    local name = obsffi.obs_hotkey_get_name(key)
    if ffi.string(name) == target then
      htk_id = tonumber(id)
      return false
    else
      return true
    end
  end
  local cb = ffi.cast("obs_hotkey_enum_func", callback_htk)
  obsffi.obs_enum_hotkeys(cb, data)
  if htk_id then
    obs.obs_hotkey_trigger_routed_callback(htk_id, false)
    obs.obs_hotkey_trigger_routed_callback(htk_id, true)
    obs.obs_hotkey_trigger_routed_callback(htk_id, false)
  end
end

The code seems to indicate it’s cross-platform, but I only tested this on Windows.

How to check whether a file exists?

function file_exists(name)
  local f = io.open(name, "r")
  if f ~= nil then
    io.close(f)
    return true
  end
  return false
end

How to read from a local file?

local handle = io.open("c:\\Users\\xxx\\Desktop\\yyy.txt", "r")
local result = handle:read("*a")
handle:close()

Note I only tested this on Windows.

How to access a certain URL (Mac)?

local url = "https://aaa.bbb/ccc"
local status = os.execute("curl --connect-timeout 1 --max-time 1 '" .. url .. "' 2>&1 &" )

Please note that I just needed to ping a URL, without worrying about status.

How to access a certain URL without blocking the UI (Mac)?

In OBS, executing commands blocks the main thread, which is really not great. However, accessing a local file doesn’t (or if it does, it’s imperceptible). Therefore, I just ran a local Node script that connected to the server and updated a local file… and then had OBS check the local file instead.
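So the OBS script ended up polling a file rather than a URL — roughly like this (a sketch: the file path is a placeholder, and start_everything stands in for whatever kicks things off):

```lua
local obs = obslua

-- Hypothetical path of the file that the local Node script keeps
-- updated with the server's latest timestamp
local timestamp_path = "c:\\Users\\xxx\\Desktop\\timestamp.txt"
local last_timestamp = nil

function check_timestamp()
  local f = io.open(timestamp_path, "r")
  if f == nil then return end
  local timestamp = f:read("*a")
  f:close()
  -- A changed timestamp means the remote was clicked: time to start
  if last_timestamp ~= nil and timestamp ~= last_timestamp then
    start_everything() -- placeholder for the actual start logic
  end
  last_timestamp = timestamp
end

-- Poll every 500ms, matching the cadence described earlier
obs.timer_add(check_timestamp, 500)
```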

How to mute a video source?

Add a gain filter with a lot of negative gain. (Maybe even more than -30dB?)

local recording_filter_settings = obs.obs_data_create()
obs.obs_data_set_double(recording_filter_settings, "db", -30.0)
local recording_filter = obs.obs_source_create_private("gain_filter", "Mute", recording_filter_settings)
obs.obs_source_filter_add(recording, recording_filter)
obs.obs_data_release(recording_filter_settings)
obs.obs_source_release(recording_filter)

What’s audio output and monitoring and how does it work?

I don’t know if I fully understand this, but here’s what I think:

  • By default, all the audio input sources in OBS are in Monitor Off mode, which means they get sent to streaming if you enable streaming (I didn’t use that functionality), and recorded properly. However, they are not actually output through the speakers (“monitored”) since you might not want to e.g. hear yourself again.
  • You can switch any audio mode to Monitor And Output, which means they will both get recorded and output to speakers.
  • You can switch any audio mode to Monitor Only (Mute Output), which means you can only hear it, but it doesn’t get streamed/recorded.
  • You can also just disable any audio input, although I am not sure how.

(If you are the person who arranged the options this way and labeled them, I have some harsh words for you. This is really confusing for something that could be three simple checkboxes.)
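For scripting, those three labels map onto integer constants (the 0 and 2 I use below come from these; obslua also exposes them as `obs.OBS_MONITORING_TYPE_*`, and the values match obs.h as far as I can tell):

```lua
-- How the UI labels map to the constants passed to
-- obs_source_set_monitoring_type:
local MONITORING = {
  ["Monitor Off"] = 0,                -- OBS_MONITORING_TYPE_NONE
  ["Monitor Only (Mute Output)"] = 1, -- OBS_MONITORING_TYPE_MONITOR_ONLY
  ["Monitor And Output"] = 2,         -- OBS_MONITORING_TYPE_MONITOR_AND_OUTPUT
}
```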

Some of the controls are available in the main window, next to meters, and the rest is in the Edit menu (for some reason) when you choose Advanced Audio Properties. Here’s a picture of both:

How do you set it programmatically?

First, you register a handler that tells you when any of those sources get activated or deactivated:

function script_load(settings)
  local sh = obs.obs_get_signal_handler()
  obs.signal_handler_connect(sh, "source_activate", source_activated)
  obs.signal_handler_connect(sh, "source_deactivate", source_deactivated)
end

Then, redirect both of these handlers to one function:

function source_activated(cd)
  set_monitoring(cd, true)
end

function source_deactivated(cd)
  set_monitoring(cd, false)
end

And then, in that function, set the monitoring options:

function set_monitoring(cd, enable)
  local source = obs.calldata_source(cd, "source")
  if source ~= nil then
    local source_id = obs.obs_source_get_unversioned_id(source)
    if source_id == 'ffmpeg_source' then
      if enable then
        -- OBS_MONITORING_TYPE_MONITOR_AND_OUTPUT
        obs.obs_source_set_monitoring_type(source, 2)
      else
        -- OBS_MONITORING_TYPE_NONE
        obs.obs_source_set_monitoring_type(source, 0)
      end
    end
    if source_id == 'scene' then
      -- OBS_MONITORING_TYPE_NONE
      obs.obs_source_set_monitoring_type(source, 0)
    end
    if source_id == 'wasapi_input_capture' then
      -- OBS_MONITORING_TYPE_NONE
      obs.obs_source_set_monitoring_type(source, 0)
    end
  end
end

I found the source ids the same way as I described above. ffmpeg_source was the After Effects video playing, scene was the entire scene, and wasapi_input_capture and wasapi_output_capture were the input and output of the little USB device. (This is on Windows.)

How to change the volume or set the audio source to be active or not?

I never got this to work. Fortunately, it wasn’t necessary. But in theory, it’s this:

obs.obs_source_set_volume(source, 0)
obs.obs_source_set_audio_active(source, false)

What is the difference between “scene” and “source” in OBS?

I seriously never got it. Sometimes I think the names are reversed, or don’t match the UI names. I basically button-mashed my code into working.

This works, for example, but the function names don’t seem to match what comes out of them, and at some point I just gave up. (My best understanding: every scene in OBS is also a source under the hood, and these functions convert between the two views of the same object — the variable names below reflect my confusion.) Please let me know what I’m missing.

new_scene = obs.obs_scene_create('NAME') -- creates a scene (obs_scene_t)
new_something = obs.obs_scene_get_source(new_scene) -- the scene’s underlying source (obs_source_t)
new_source = obs.obs_scene_from_source(new_something) -- and back to a scene (obs_scene_t)

What’s the importance of “local” in front of a variable?

It’s scope: local confines a variable to the enclosing block or function, while leaving it off creates (or overwrites) a global. In a small single-file OBS script this rarely matters, which is why I got away with mostly ignoring it.
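A quick illustration of the difference:

```lua
-- Without "local", assignment creates a global visible everywhere.
x = 1

local function bump()
  local x = 10 -- a new variable that shadows the global x inside this function
  x = x + 1
  return x
end

-- bump() returns 11, while the global x is still 1.
```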

Thank you so much to Lauren Budorick for being patient, and testing the setup with me on multiple occasions. Additional thanks to Basia Wichary and Matt Braithwaite.

The rectangle behind you, a series of articles about interactive presentations.

By Marcin Wichary (@mwichary)
