Creating a Smooth and Interactive Video Timeline with React Native and FFmpeg
TL;DR: you will learn the following:
1️⃣ How to install and run FFmpeg in a React Native project, on both Android and iOS.
2️⃣ Two secret tricks to render the video timeline correctly.
You can also access the final version of the code on GitHub.
Why create a video timeline on React Native?
In 2020, Nicolas and I were brainstorming ideas to play around with our habit of sending each other short clips of our skateboarding flat-ground tricks over Messenger.
To record these, we usually have to put our phones on the ground and start recording, which results in very long videos with many failed attempts before one successful trick lands. In the end, only the 4 seconds of the landed trick are worth keeping.
With little to no experience in mobile video editing, we built a first prototype in Origami Studio.
The core loop was complete: once a video is selected, the user can quickly trim it by placing the duration window around “the pop”, the moment the board hits the ground.
We tested the experience with a few skateboarding friends, but as time passed, we focused our effort on the Stoke Club app… Then came November 2022 when, determined to turn this prototype into a usable app, I found some time to build this trimming experience in React Native.
I shared the result on Reddit, hitting the top post of the week, and the comments I had from the community motivated me to write a how-to guide.
What do we want to achieve?
Along with the core user loop described in the previous section come some specifications. Namely, the following:
- Given the video is loaded, then we see that the timeline is placed at the start of the duration window.
- Given the video timeline goes out of the duration window, then it bounces back to its borders.
- Given the video is loaded, then the video loops only in the duration window.
- Given the user scrolls the timeline, then the video scrubs to the frame represented on the pop line at the center of the duration window.
- Given the user releases the scroll, then the video plays again, starting from the “pop” line and looping forward.
With this in mind, let’s finally jump into the code.
How can I code this?
While generating the timeline was easy with Origami Studio, I had little idea of how to implement it in React Native. That is, until I heard about FFmpeg.
I’ll describe the coding steps below:
- Pre-requisites: React Native & Image picker
- Use FFmpeg in React Native
- Completing the timeline
1. Pre-requisites
Start a React Native project
npx react-native init VideoTimelineExample
Follow the environment setup instructions from the React Native documentation for both targets, iOS and Android.
Run on both targets to check that everything is OK.
npx react-native run-android
npx react-native run-ios
You should get the React Native default screens.
Add image picker
To select a video from the library, we will use RN Image Crop Picker. Follow the steps described in its installation section.
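At the time of writing, the installation typically boils down to the following (see their docs for the extra platform configuration, such as photo library permissions):
yarn add react-native-image-crop-picker
cd ios && pod install && cd ..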
First, make a simple screen with a button in the center by modifying App.js.
// App.js
import React from 'react';
import {SafeAreaView, View, Text, StyleSheet} from 'react-native';
const App = () => {
return (
<SafeAreaView style={styles.mainContainer}>
<View style={styles.buttonContainer}>
<Text style={styles.buttonText}>Select a video</Text>
</View>
</SafeAreaView>
);
};
const styles = StyleSheet.create({
mainContainer: {
flex: 1,
alignItems: 'center',
justifyContent: 'center',
},
buttonContainer: {
backgroundColor: '#000',
paddingVertical: 12,
paddingHorizontal: 32,
borderRadius: 16,
},
buttonText: {
color: '#fff',
},
});
export default App;
Then, let’s add the image picker handler to the button. We are going to set a state with the minimum video information: URI, fileName, and creationDate.
// App.js
import React, {useState} from 'react';
import {SafeAreaView, Pressable, Text, StyleSheet} from 'react-native';
import ImagePicker from 'react-native-image-crop-picker';
const getFileNameFromPath = path => {
const fragments = path.split('/');
let fileName = fragments[fragments.length - 1];
fileName = fileName.split('.')[0];
return fileName;
};
const App = () => {
const [selectedVideo, setSelectedVideo] = useState(null); // {uri: <string>, localFileName: <string>, creationDate: <Date>}
const handlePressSelectVideoButton = () => {
ImagePicker.openPicker({
mediaType: 'video',
}).then(videoAsset => {
console.log(`Selected video ${JSON.stringify(videoAsset, null, 2)}`);
setSelectedVideo({
uri: videoAsset.sourceURL || videoAsset.path,
localFileName: getFileNameFromPath(videoAsset.path),
creationDate: videoAsset.creationDate,
});
});
};
return (
<SafeAreaView style={styles.mainContainer}>
<Pressable
style={styles.buttonContainer}
onPress={handlePressSelectVideoButton}>
<Text style={styles.buttonText}>Select a video</Text>
</Pressable>
</SafeAreaView>
);
};
const styles = StyleSheet.create({
mainContainer: {
flex: 1,
alignItems: 'center',
justifyContent: 'center',
},
buttonContainer: {
backgroundColor: '#000',
paddingVertical: 12,
paddingHorizontal: 32,
borderRadius: 16,
},
buttonText: {
color: '#fff',
},
});
export default App;
And finally, we will display the selected video on a loop. For this step, we will use react-native-video; follow the installation instructions in its docs.
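As with the image picker, installation usually amounts to adding the package and installing the iOS pods (check their docs if your React Native version needs more setup):
yarn add react-native-video
cd ios && pod install && cd ..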
// App.js
import React, {useState} from 'react';
import {
SafeAreaView,
Dimensions,
Pressable,
Text,
StyleSheet,
View,
} from 'react-native';
import ImagePicker from 'react-native-image-crop-picker';
import Video from 'react-native-video';
const SCREEN_WIDTH = Dimensions.get('screen').width;
const SCREEN_HEIGHT = Dimensions.get('screen').height;
const getFileNameFromPath = path => {
const fragments = path.split('/');
let fileName = fragments[fragments.length - 1];
fileName = fileName.split('.')[0];
return fileName;
};
const App = () => {
const [selectedVideo, setSelectedVideo] = useState(null); // {uri: <string>, localFileName: <string>, creationDate: <Date>}
const handlePressSelectVideoButton = () => {
ImagePicker.openPicker({
mediaType: 'video',
}).then(videoAsset => {
console.log(`Selected video ${JSON.stringify(videoAsset, null, 2)}`);
setSelectedVideo({
uri: videoAsset.sourceURL || videoAsset.path,
localFileName: getFileNameFromPath(videoAsset.path),
creationDate: videoAsset.creationDate,
});
});
};
return (
<SafeAreaView style={styles.mainContainer}>
{selectedVideo ? (
<View style={styles.videoContainer}>
<Video
style={styles.video}
resizeMode={'cover'}
source={{uri: selectedVideo.uri}}
repeat={true}
/>
</View>
) : (
<Pressable
style={styles.buttonContainer}
onPress={handlePressSelectVideoButton}>
<Text style={styles.buttonText}>Select a video</Text>
</Pressable>
)}
</SafeAreaView>
);
};
const styles = StyleSheet.create({
mainContainer: {
flex: 1,
alignItems: 'center',
justifyContent: 'center',
},
buttonContainer: {
backgroundColor: '#000',
paddingVertical: 12,
paddingHorizontal: 32,
borderRadius: 16,
},
buttonText: {
color: '#fff',
},
videoContainer: {
width: SCREEN_WIDTH,
height: 0.6 * SCREEN_HEIGHT,
backgroundColor: 'rgba(255,255,255,0.1)',
alignItems: 'center',
justifyContent: 'center',
},
video: {
height: '100%',
width: '100%',
},
});
export default App;
2. Use FFmpeg in React Native
What is FFmpeg?
FFmpeg is an open-source software project that provides a complete solution for recording, converting, and streaming audio and video. It is widely used for transcoding multimedia files, allowing users to convert files between formats and play them on different devices. It also provides a set of libraries and tools that can be used to develop media applications (according to ChatGPT). That last part is what interests us: we will use FFmpeg to generate images from a given video, creating the frames for the timeline.
Install and test FFmpeg
Start by installing ffmpeg-kit-react-native. Refer to its documentation for the details.
yarn add ffmpeg-kit-react-native
Note: to run it on Android, you must raise the minimum SDK version to 24 in the build.gradle file, like below:
minSdkVersion = 24
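Depending on your React Native template, this value usually lives in the ext block of the project-level android/build.gradle (the exact layout may differ in your project):
// android/build.gradle
buildscript {
  ext {
    minSdkVersion = 24 // required by ffmpeg-kit-react-native
    // compileSdkVersion, targetSdkVersion, etc. stay as generated
  }
}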
Now, we will start using FFmpeg by running a command once the video is loaded. Let’s add a handleVideoLoad handler.
import React, {useState} from 'react';
import {
SafeAreaView,
Dimensions,
Pressable,
Text,
StyleSheet,
View,
} from 'react-native';
import RNFS from 'react-native-fs';
import {FFmpegKit, ReturnCode, FFmpegKitConfig} from 'ffmpeg-kit-react-native';
import ImagePicker from 'react-native-image-crop-picker';
import Video from 'react-native-video';
const SCREEN_WIDTH = Dimensions.get('screen').width;
const SCREEN_HEIGHT = Dimensions.get('screen').height;
const getFileNameFromPath = path => {
const fragments = path.split('/');
let fileName = fragments[fragments.length - 1];
fileName = fileName.split('.')[0];
return fileName;
};
const App = () => {
const [selectedVideo, setSelectedVideo] = useState(null); // {uri: <string>, localFileName: <string>, creationDate: <Date>}
const handlePressSelectVideoButton = () => {
ImagePicker.openPicker({
mediaType: 'video',
}).then(videoAsset => {
console.log(`Selected video ${JSON.stringify(videoAsset, null, 2)}`);
setSelectedVideo({
uri: videoAsset.sourceURL || videoAsset.path,
localFileName: getFileNameFromPath(videoAsset.path),
creationDate: videoAsset.creationDate,
});
});
};
const handleVideoLoad = () => {
let outputVideoPath = `${RNFS.CachesDirectoryPath}/${selectedVideo.localFileName}.mp4`; // this test command outputs a video, not an image
FFmpegKit.execute(
`-i ${selectedVideo.uri} -c:v mpeg4 ${outputVideoPath}`,
).then(async session => {
const state = FFmpegKitConfig.sessionStateToString(
await session.getState(),
);
const returnCode = await session.getReturnCode();
const failStackTrace = await session.getFailStackTrace();
const duration = await session.getDuration();
if (ReturnCode.isSuccess(returnCode)) {
console.log(
`Encode completed successfully in ${duration} milliseconds.`,
);
} else if (ReturnCode.isCancel(returnCode)) {
console.log('Encode canceled');
} else {
console.log(
`Encode failed with state ${state} and rc ${returnCode}.${
failStackTrace ? '\n' + failStackTrace : ''
}`,
);
}
});
};
return (
<SafeAreaView style={styles.mainContainer}>
{selectedVideo ? (
<View style={styles.videoContainer}>
<Video
style={styles.video}
resizeMode={'cover'}
source={{uri: selectedVideo.uri}}
repeat={true}
onLoad={handleVideoLoad}
/>
</View>
) : (
<Pressable
style={styles.buttonContainer}
onPress={handlePressSelectVideoButton}>
<Text style={styles.buttonText}>Select a video</Text>
</Pressable>
)}
</SafeAreaView>
);
};
const styles = StyleSheet.create({
mainContainer: {
flex: 1,
alignItems: 'center',
justifyContent: 'center',
},
buttonContainer: {
backgroundColor: '#000',
paddingVertical: 12,
paddingHorizontal: 32,
borderRadius: 16,
},
buttonText: {
color: '#fff',
},
videoContainer: {
width: SCREEN_WIDTH,
height: 0.6 * SCREEN_HEIGHT,
backgroundColor: 'rgba(255,255,255,0.1)',
alignItems: 'center',
justifyContent: 'center',
},
video: {
height: '100%',
width: '100%',
},
});
export default App;
Look at handleVideoLoad: we run an FFmpeg command, the one given in the library documentation, to check that everything works on both targets. And it should!
Note that we use react-native-fs to handle the file system paths, so be sure to install it.
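At the time of writing, that is typically:
yarn add react-native-fs
cd ios && pod install && cd ..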
Once you pick a video, it runs the command, and you should get the following log:
Encode completed successfully in XXXX milliseconds.
Now we have our first FFmpeg command working in React Native. Voilà! Note that the command above merely re-encodes the video into a new file. You can check it by opening the path given in the logs:
LOG Output #0, mp4, to '/data/user/0/com.videotimelineexample/cache/RPReplay_Final1668803007.mp4':
For our timeline, we want to generate one image per second of video. To do this, we need to modify the FFmpeg command. You can browse the (extensive) FFmpeg documentation to check all the available parameters. In our case, we will use the following command:
-ss 0 -i ${videoURI} -vf "fps=${FRAME_PER_SEC}/1:round=up,scale=${FRAME_WIDTH}:-2" -vframes ${frameNumber} ${outputImagePath}
Some explanations about the above command:
- -ss 0: start at second 0
- -i: URI of the input file
- -vf: video filters. We only need one frame per second, so FRAME_PER_SEC is set to 1. As for scale, the timeline tiles only need a small size: they will be 40px wide, so we set FRAME_WIDTH to 80 to handle 2x-resolution screens. The -2 lets FFmpeg pick whatever height preserves the aspect ratio, rounded to an even number.
- -vframes: the number of frames we want. Since we extract one frame per second, we base it on the video duration.
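To make these parameters concrete, here is what the command resolves to for a hypothetical 12-second video with FRAME_PER_SEC = 1 and FRAME_WIDTH = 80 (paths shortened for readability):
-ss 0 -i /path/to/video.mp4 -vf "fps=1/1:round=up,scale=80:-2" -vframes 12 /path/to/cache/video_%4d.png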
To separate concerns, let’s extract the FFmpeg code into a static class.
// lib/FFmpeg.js
import {FFmpegKit, FFmpegKitConfig, ReturnCode} from 'ffmpeg-kit-react-native';
import RNFS from 'react-native-fs';
import {
FRAME_PER_SEC,
FRAME_WIDTH,
} from '../App.js';
class FFmpegWrapper {
static getFrames(
localFileName,
videoURI,
frameNumber,
successCallback,
errorCallback,
) {
let outputImagePath = `${RNFS.CachesDirectoryPath}/${localFileName}_%4d.png`;
const ffmpegCommand = `-ss 0 -i ${videoURI} -vf "fps=${FRAME_PER_SEC}/1:round=up,scale=${FRAME_WIDTH}:-2" -vframes ${frameNumber} ${outputImagePath}`;
FFmpegKit.executeAsync(
ffmpegCommand,
async session => {
const state = FFmpegKitConfig.sessionStateToString(
await session.getState(),
);
const returnCode = await session.getReturnCode();
const failStackTrace = await session.getFailStackTrace();
const duration = await session.getDuration();
if (ReturnCode.isSuccess(returnCode)) {
console.log(
`Encode completed successfully in ${duration} milliseconds.`,
);
console.log(`Check at ${outputImagePath}`);
successCallback(outputImagePath);
} else {
console.log('Encode failed. Please check log for the details.');
console.log(
`Encode failed with state ${state} and rc ${returnCode}.${
failStackTrace ? '\n' + failStackTrace : ''
}`,
);
errorCallback();
}
},
log => {
console.log(log.getMessage());
},
statistics => {
console.log(statistics);
},
).then(session =>
console.log(
`Async FFmpeg process started with sessionId ${session.getSessionId()}.`,
),
);
}
}
export default FFmpegWrapper;
About the outputImagePath, you see that we are using a specific pattern:
${RNFS.CachesDirectoryPath}/${localFileName}_%4d.png
Note the %4d: it tells FFmpeg to suffix each generated image with an automatically incremented, four-digit number.
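For a hypothetical three-second video.mp4, FFmpeg would therefore write video_0001.png, video_0002.png, and video_0003.png. On the JavaScript side, we can rebuild each concrete path from the pattern with padStart, which is exactly what the success callback will do in App.js:
// Sketch: expand the %4d pattern back into a concrete frame path
const framePathFor = (pattern, index) =>
  pattern.replace('%4d', String(index + 1).padStart(4, '0'));
framePathFor('/cache/video_%4d.png', 0); // '/cache/video_0001.png'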
And we can simplify App.js by modifying handleVideoLoad:
// in App.js
const handleVideoLoad = videoAssetLoaded => {
const numberOfFrames = Math.ceil(videoAssetLoaded.duration);
FFmpegWrapper.getFrames(
selectedVideo.localFileName,
selectedVideo.uri,
numberOfFrames,
filePath => {
console.log('Empty success callback');
},
);
};
At this point, you also need to add two constants to App.js:
export const FRAME_PER_SEC = 1;
export const FRAME_WIDTH = 80;
Let’s select a video again and see the result in the logs:
Nice, we have our images generated. FFmpeg does exactly what we want. Let’s see how we integrate the frames to complete the timeline component.
3. Completing the timeline
Completing the timeline means three things:
- Displaying the frames,
- Adding the duration window,
- Handling events on the timeline, like the interactions with the selected video.
Displaying the frames
To display the frames, we will use the success callback we left empty. We add a state variable, frames, which is filled with placeholder items while FFmpeg computes the images and then replaced with the generated file paths once done.
To do so, modify handleVideoLoad:
const handleVideoLoad = videoAssetLoaded => {
const numberOfFrames = Math.ceil(videoAssetLoaded.duration);
setFrames(
Array(numberOfFrames).fill({
status: FRAME_STATUS.LOADING.name.description,
}),
);
FFmpegWrapper.getFrames(
selectedVideo.localFileName,
selectedVideo.uri,
numberOfFrames,
filePath => {
const _frames = [];
for (let i = 0; i < numberOfFrames; i++) {
_frames.push(
`${filePath.replace('%4d', String(i + 1).padStart(4, '0'))}`, // filePath already ends with .png
);
}
setFrames(_frames);
},
);
};
You will need to define FRAME_STATUS like below:
const FRAME_STATUS = Object.freeze({
LOADING: {name: Symbol('LOADING')},
READY: {name: Symbol('READY')},
});
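A quick note on this structure: a Symbol’s description property returns its label as a plain string (ES2019), so the status check boils down to a simple string comparison:
console.log(FRAME_STATUS.LOADING.name.description); // 'LOADING'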
Which gives a complete App.js file like the one below:
// App.js
import React, {useState} from 'react';
import {
SafeAreaView,
Dimensions,
Pressable,
Text,
StyleSheet,
View,
} from 'react-native';
import ImagePicker from 'react-native-image-crop-picker';
import Video from 'react-native-video';
import FFmpegWrapper from './lib/FFmpeg';
const SCREEN_WIDTH = Dimensions.get('screen').width;
const SCREEN_HEIGHT = Dimensions.get('screen').height;
export const FRAME_PER_SEC = 1;
export const FRAME_WIDTH = 80;
const getFileNameFromPath = path => {
const fragments = path.split('/');
let fileName = fragments[fragments.length - 1];
fileName = fileName.split('.')[0];
return fileName;
};
const FRAME_STATUS = Object.freeze({
LOADING: {name: Symbol('LOADING')},
READY: {name: Symbol('READY')},
});
const App = () => {
const [selectedVideo, setSelectedVideo] = useState(); // {uri: <string>, localFileName: <string>, creationDate: <Date>}
const [frames, setFrames] = useState(); // <[{status: <FRAME_STATUS>}]>
const handlePressSelectVideoButton = () => {
ImagePicker.openPicker({
mediaType: 'video',
}).then(videoAsset => {
console.log(`Selected video ${JSON.stringify(videoAsset, null, 2)}`);
setSelectedVideo({
uri: videoAsset.sourceURL || videoAsset.path,
localFileName: getFileNameFromPath(videoAsset.path),
creationDate: videoAsset.creationDate,
});
});
};
const handleVideoLoad = videoAssetLoaded => {
const numberOfFrames = Math.ceil(videoAssetLoaded.duration);
setFrames(
Array(numberOfFrames).fill({
status: FRAME_STATUS.LOADING.name.description,
}),
);
FFmpegWrapper.getFrames(
selectedVideo.localFileName,
selectedVideo.uri,
numberOfFrames,
filePath => {
const _frames = [];
for (let i = 0; i < numberOfFrames; i++) {
_frames.push(
`${filePath.replace('%4d', String(i + 1).padStart(4, '0'))}`,
);
}
setFrames(_frames);
},
);
};
return (
<SafeAreaView style={styles.mainContainer}>
{selectedVideo ? (
<View style={styles.videoContainer}>
<Video
style={styles.video}
resizeMode={'cover'}
source={{uri: selectedVideo.uri}}
repeat={true}
onLoad={handleVideoLoad}
/>
</View>
) : (
<Pressable
style={styles.buttonContainer}
onPress={handlePressSelectVideoButton}>
<Text style={styles.buttonText}>Select a video</Text>
</Pressable>
)}
</SafeAreaView>
);
};
const styles = StyleSheet.create({
mainContainer: {
flex: 1,
alignItems: 'center',
justifyContent: 'center',
},
buttonContainer: {
backgroundColor: '#000',
paddingVertical: 12,
paddingHorizontal: 32,
borderRadius: 16,
},
buttonText: {
color: '#fff',
},
videoContainer: {
width: SCREEN_WIDTH,
height: 0.6 * SCREEN_HEIGHT,
backgroundColor: 'rgba(255,255,255,0.1)',
alignItems: 'center',
justifyContent: 'center',
},
video: {
height: '100%',
width: '100%',
},
});
export default App;
We will use a ScrollView, placed under the video container, to display the frames:
<ScrollView
showsHorizontalScrollIndicator={false}
horizontal={true}
style={styles.framesLine}
alwaysBounceHorizontal={true}
scrollEventThrottle={1}
>
{frames.map((frame, index) => {
if (frame.status === FRAME_STATUS.LOADING.name.description) {
return <View style={styles.loadingFrame} key={index}></View>;
} else {
return (
<Image
key={index}
source={{
uri: 'file://' + frame,
}}
style={{
width: TILE_WIDTH,
height: TILE_HEIGHT,
}}
/>
);
}
})}
</ScrollView>
Also, we will isolate the rendering of the frames in a renderFrame function.
Below is the complete App.js file with the correct styling:
// App.js
import React, {useState} from 'react';
import {
SafeAreaView,
Dimensions,
Pressable,
Text,
StyleSheet,
View,
ScrollView,
Image,
} from 'react-native';
import ImagePicker from 'react-native-image-crop-picker';
import Video from 'react-native-video';
import FFmpegWrapper from './lib/FFmpeg';
const SCREEN_WIDTH = Dimensions.get('screen').width;
const SCREEN_HEIGHT = Dimensions.get('screen').height;
export const FRAME_PER_SEC = 1;
export const FRAME_WIDTH = 80;
const TILE_HEIGHT = 80;
const TILE_WIDTH = FRAME_WIDTH / 2; // to get a 2x resolution
const getFileNameFromPath = path => {
const fragments = path.split('/');
let fileName = fragments[fragments.length - 1];
fileName = fileName.split('.')[0];
return fileName;
};
const FRAME_STATUS = Object.freeze({
LOADING: {name: Symbol('LOADING')},
READY: {name: Symbol('READY')},
});
const App = () => {
const [selectedVideo, setSelectedVideo] = useState(); // {uri: <string>, localFileName: <string>, creationDate: <Date>}
const [frames, setFrames] = useState(); // <[{status: <FRAME_STATUS>}]>
const handlePressSelectVideoButton = () => {
ImagePicker.openPicker({
mediaType: 'video',
}).then(videoAsset => {
console.log(`Selected video ${JSON.stringify(videoAsset, null, 2)}`);
setSelectedVideo({
uri: videoAsset.sourceURL || videoAsset.path,
localFileName: getFileNameFromPath(videoAsset.path),
creationDate: videoAsset.creationDate,
});
});
};
const handleVideoLoad = videoAssetLoaded => {
const numberOfFrames = Math.ceil(videoAssetLoaded.duration);
setFrames(
Array(numberOfFrames).fill({
status: FRAME_STATUS.LOADING.name.description,
}),
);
FFmpegWrapper.getFrames(
selectedVideo.localFileName,
selectedVideo.uri,
numberOfFrames,
filePath => {
const _frames = [];
for (let i = 0; i < numberOfFrames; i++) {
_frames.push(
`${filePath.replace('%4d', String(i + 1).padStart(4, '0'))}`,
);
}
setFrames(_frames);
},
);
};
const renderFrame = (frame, index) => {
if (frame.status === FRAME_STATUS.LOADING.name.description) {
return <View style={styles.loadingFrame} key={index}></View>;
} else {
return (
<Image
key={index}
source={{
uri: 'file://' + frame,
}}
style={{
width: TILE_WIDTH,
height: TILE_HEIGHT,
}}
/>
);
}
};
return (
<SafeAreaView style={styles.mainContainer}>
{selectedVideo ? (
<>
<View style={styles.videoContainer}>
<Video
style={styles.video}
resizeMode={'cover'}
source={{uri: selectedVideo.uri}}
repeat={true}
onLoad={handleVideoLoad}
/>
</View>
{frames && (
<ScrollView
showsHorizontalScrollIndicator={false}
horizontal={true}
style={styles.framesLine}
alwaysBounceHorizontal={true}
scrollEventThrottle={1}>
{frames.map((frame, index) => renderFrame(frame, index))}
</ScrollView>
)}
</>
) : (
<Pressable
style={styles.buttonContainer}
onPress={handlePressSelectVideoButton}>
<Text style={styles.buttonText}>Select a video</Text>
</Pressable>
)}
</SafeAreaView>
);
};
const styles = StyleSheet.create({
mainContainer: {
flex: 1,
alignItems: 'center',
justifyContent: 'center',
},
buttonContainer: {
backgroundColor: '#000',
paddingVertical: 12,
paddingHorizontal: 32,
borderRadius: 16,
},
buttonText: {
color: '#fff',
},
videoContainer: {
width: SCREEN_WIDTH,
height: 0.6 * SCREEN_HEIGHT,
backgroundColor: 'rgba(255,255,255,0.1)',
alignItems: 'center',
justifyContent: 'center',
},
video: {
height: '100%',
width: '100%',
},
framesLine: {
width: SCREEN_WIDTH,
},
loadingFrame: {
width: TILE_WIDTH,
height: TILE_HEIGHT,
backgroundColor: 'rgba(0,0,0,0.05)',
borderColor: 'rgba(0,0,0,0.1)',
borderWidth: 1,
},
});
export default App;
Adding the duration window
The duration window is this yellow component lying on top of the timeline.
Showing this component along with the “pop line” is straightforward, see below:
// ...
export const FRAME_PER_SEC = 1;
export const FRAME_WIDTH = 80;
const TILE_HEIGHT = 80;
const TILE_WIDTH = FRAME_WIDTH / 2; // to get a 2x resolution
const DURATION_WINDOW_DURATION = 4;
const DURATION_WINDOW_BORDER_WIDTH = 4;
const DURATION_WINDOW_WIDTH =
DURATION_WINDOW_DURATION * FRAME_PER_SEC * TILE_WIDTH;
const POPLINE_POSITION = '50%';
// ...
const App = () => {
// ...
{frames && (
<View style={styles.durationWindowAndFramesLineContainer}>
<View style={styles.durationWindow}>
<View style={styles.durationLabelContainer}>
<Text style={styles.durationLabel}>
{DURATION_WINDOW_DURATION} sec.
</Text>
</View>
</View>
<View style={styles.popLineContainer}>
<View style={styles.popLine} />
</View>
<ScrollView
showsHorizontalScrollIndicator={false}
horizontal={true}
style={styles.framesLine}
alwaysBounceHorizontal={true}
scrollEventThrottle={1}>
{frames.map((frame, index) => renderFrame(frame, index))}
</ScrollView>
</View>
)}
// ...
};
const styles = StyleSheet.create({
// ...
durationWindowAndFramesLineContainer: {
top: -DURATION_WINDOW_BORDER_WIDTH,
height: TILE_HEIGHT + DURATION_WINDOW_BORDER_WIDTH * 2,
justifyContent: 'center',
},
durationWindow: {
width: DURATION_WINDOW_WIDTH,
borderColor: 'yellow',
borderWidth: DURATION_WINDOW_BORDER_WIDTH,
borderRadius: 4,
height: TILE_HEIGHT,
alignSelf: 'center',
zIndex: 25,
},
durationLabelContainer: {
backgroundColor: 'yellow',
alignSelf: 'center',
top: -26,
paddingHorizontal: 8,
paddingVertical: 4,
borderTopLeftRadius: 4,
borderTopRightRadius: 4,
},
durationLabel: {
color: 'rgba(0,0,0,0.6)',
fontWeight: '700',
},
popLineContainer: {
position: 'absolute',
alignSelf: POPLINE_POSITION === '50%' && 'center', // otherwise raise Error("Not implemented"),
zIndex: 25,
},
popLine: {
width: 3,
height: TILE_HEIGHT,
backgroundColor: 'yellow',
},
});
⚠️ But we have two problems:
- We can’t scroll the timeline to the left — it bounces back immediately.
- We can’t scroll when touching on the duration window.
🤫 Below are the two tricks to cope with these problems.
For problem 1, the trick is to prepend an element of a given width (half the screen width minus half the duration window width) and to append a similar one at the end.
In other words, the ScrollView content becomes: [spacer] [frame 1] … [frame N] [spacer], so the first and last frames can be scrolled all the way to the pop line. The sketch below shows the width computation.
For problem 2, the trick is to place the duration window below the timeline in z order and to add only “border elements” on top of the frames line. Now that the duration window sits behind, we also need to increase its height a little.
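Both spacers share the same width, half the screen minus half the duration window (SPACER_WIDTH is just an illustrative name here; the prependFrame and appendFrame styles in the complete code below inline the expression):
// Width that lets the first and last frames reach the center of the window
const SPACER_WIDTH = SCREEN_WIDTH / 2 - DURATION_WINDOW_WIDTH / 2;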
Finally, to get the proper rendering on both Android and iOS, we need to be careful about z positions. Let’s arbitrarily set the following:
- For the video (videoContainer): zIndex 0
- For the duration window (durationWindowAndFramesLineContainer): zIndex 10
- For anything we want above the timeline, i.e. the two duration window borders and the pop line (popLineContainer, durationWindowLeftBorder, durationWindowRightBorder): zIndex 25
<View style={styles.durationWindowLeftBorder}/>
<View style={styles.durationWindowRightBorder}/>
// in styles...
durationWindowLeftBorder: {
position: 'absolute',
width: DURATION_WINDOW_BORDER_WIDTH,
alignSelf: 'center',
height: TILE_HEIGHT + DURATION_WINDOW_BORDER_WIDTH * 2,
left: SCREEN_WIDTH / 2 - DURATION_WINDOW_WIDTH / 2,
borderTopLeftRadius: 8,
borderBottomLeftRadius: 8,
backgroundColor: 'yellow',
zIndex: 25,
},
durationWindowRightBorder: {
position: 'absolute',
width: DURATION_WINDOW_BORDER_WIDTH,
right: SCREEN_WIDTH - SCREEN_WIDTH / 2 - DURATION_WINDOW_WIDTH / 2,
height: TILE_HEIGHT + DURATION_WINDOW_BORDER_WIDTH * 2,
borderTopRightRadius: 8,
borderBottomRightRadius: 8,
backgroundColor: 'yellow',
zIndex: 25,
},
Note: we also need to remove the centering of items in the videoContainer style.
This leads us to the following complete code below:
// App.js
import React, {useState} from 'react';
import {
SafeAreaView,
Dimensions,
Pressable,
Text,
StyleSheet,
View,
ScrollView,
Image,
} from 'react-native';
import ImagePicker from 'react-native-image-crop-picker';
import Video from 'react-native-video';
import FFmpegWrapper from './lib/FFmpeg';
const SCREEN_WIDTH = Dimensions.get('screen').width;
const SCREEN_HEIGHT = Dimensions.get('screen').height;
export const FRAME_PER_SEC = 1;
export const FRAME_WIDTH = 80;
const TILE_HEIGHT = 80;
const TILE_WIDTH = FRAME_WIDTH / 2; // to get a 2x resolution
const DURATION_WINDOW_DURATION = 4;
const DURATION_WINDOW_BORDER_WIDTH = 4;
const DURATION_WINDOW_WIDTH =
DURATION_WINDOW_DURATION * FRAME_PER_SEC * TILE_WIDTH;
const POPLINE_POSITION = '50%';
const getFileNameFromPath = path => {
const fragments = path.split('/');
let fileName = fragments[fragments.length - 1];
fileName = fileName.split('.')[0];
return fileName;
};
const FRAME_STATUS = Object.freeze({
LOADING: {name: Symbol('LOADING')},
READY: {name: Symbol('READY')},
});
const App = () => {
const [selectedVideo, setSelectedVideo] = useState(); // {uri: <string>, localFileName: <string>, creationDate: <Date>}
const [frames, setFrames] = useState(); // <[{status: <FRAME_STATUS>, uri: <string>}]>
const handlePressSelectVideoButton = () => {
ImagePicker.openPicker({
mediaType: 'video',
}).then(videoAsset => {
console.log(`Selected video ${JSON.stringify(videoAsset, null, 2)}`);
setSelectedVideo({
uri: videoAsset.sourceURL || videoAsset.path,
localFileName: getFileNameFromPath(videoAsset.path),
creationDate: videoAsset.creationDate,
});
});
};
const handleVideoLoad = videoAssetLoaded => {
const numberOfFrames = Math.ceil(videoAssetLoaded.duration);
setFrames(
Array(numberOfFrames).fill({
status: FRAME_STATUS.LOADING.name.description,
}),
);
FFmpegWrapper.getFrames(
selectedVideo.localFileName,
selectedVideo.uri,
numberOfFrames,
filePath => {
const _framesURI = [];
for (let i = 0; i < numberOfFrames; i++) {
_framesURI.push(
`${filePath.replace('%4d', String(i + 1).padStart(4, '0'))}`,
);
}
const _frames = _framesURI.map(_frameURI => ({
uri: _frameURI,
status: FRAME_STATUS.READY.name.description,
}));
setFrames(_frames);
},
);
};
const renderFrame = (frame, index) => {
if (frame.status === FRAME_STATUS.LOADING.name.description) {
return <View style={styles.loadingFrame} key={index} />;
} else {
return (
<Image
key={index}
source={{uri: 'file://' + frame.uri}}
style={{
width: TILE_WIDTH,
height: TILE_HEIGHT,
}}
onLoad={() => {
console.log('Image loaded');
}}
/>
);
}
};
return (
<SafeAreaView style={styles.mainContainer}>
{selectedVideo ? (
<>
<View style={styles.videoContainer}>
<Video
style={styles.video}
resizeMode={'cover'}
source={{uri: selectedVideo.uri}}
repeat={true}
onLoad={handleVideoLoad}
/>
</View>
{frames && (
<View style={styles.durationWindowAndFramesLineContainer}>
<View style={styles.durationWindow}>
<View style={styles.durationLabelContainer}>
<Text style={styles.durationLabel}>
{DURATION_WINDOW_DURATION} sec.
</Text>
</View>
</View>
<View style={styles.popLineContainer}>
<View style={styles.popLine} />
</View>
<View style={styles.durationWindowLeftBorder} />
<View style={styles.durationWindowRightBorder} />
<ScrollView
showsHorizontalScrollIndicator={false}
horizontal={true}
style={styles.framesLine}
alwaysBounceHorizontal={true}
scrollEventThrottle={1}>
<View style={styles.prependFrame} />
{frames.map((frame, index) => renderFrame(frame, index))}
<View style={styles.appendFrame} />
</ScrollView>
</View>
)}
</>
) : (
<Pressable
style={styles.buttonContainer}
onPress={handlePressSelectVideoButton}>
<Text style={styles.buttonText}>Select a video</Text>
</Pressable>
)}
</SafeAreaView>
);
};
const styles = StyleSheet.create({
mainContainer: {
flex: 1,
alignItems: 'center',
justifyContent: 'center',
},
buttonContainer: {
backgroundColor: '#000',
paddingVertical: 12,
paddingHorizontal: 32,
borderRadius: 16,
},
buttonText: {
color: '#fff',
},
videoContainer: {
width: SCREEN_WIDTH,
height: 0.6 * SCREEN_HEIGHT,
backgroundColor: 'rgba(255,255,255,0.1)',
zIndex: 0,
},
video: {
height: '100%',
width: '100%',
},
durationWindowAndFramesLineContainer: {
top: -DURATION_WINDOW_BORDER_WIDTH,
width: SCREEN_WIDTH,
height: TILE_HEIGHT + DURATION_WINDOW_BORDER_WIDTH * 2,
justifyContent: 'center',
zIndex: 10,
},
durationWindow: {
width: DURATION_WINDOW_WIDTH,
borderColor: 'yellow',
borderWidth: DURATION_WINDOW_BORDER_WIDTH,
borderRadius: 4,
height: TILE_HEIGHT + DURATION_WINDOW_BORDER_WIDTH * 2,
alignSelf: 'center',
},
durationLabelContainer: {
backgroundColor: 'yellow',
alignSelf: 'center',
top: -26,
paddingHorizontal: 8,
paddingVertical: 4,
borderTopLeftRadius: 4,
borderTopRightRadius: 4,
},
durationLabel: {
color: 'rgba(0,0,0,0.6)',
fontWeight: '700',
},
popLineContainer: {
position: 'absolute',
alignSelf: POPLINE_POSITION === '50%' && 'center',
zIndex: 25,
},
popLine: {
width: 3,
height: TILE_HEIGHT,
backgroundColor: 'yellow',
},
durationWindowLeftBorder: {
position: 'absolute',
width: DURATION_WINDOW_BORDER_WIDTH,
alignSelf: 'center',
height: TILE_HEIGHT + DURATION_WINDOW_BORDER_WIDTH * 2,
left: SCREEN_WIDTH / 2 - DURATION_WINDOW_WIDTH / 2,
borderTopLeftRadius: 8,
borderBottomLeftRadius: 8,
backgroundColor: 'yellow',
zIndex: 25,
},
durationWindowRightBorder: {
position: 'absolute',
width: DURATION_WINDOW_BORDER_WIDTH,
right: SCREEN_WIDTH - SCREEN_WIDTH / 2 - DURATION_WINDOW_WIDTH / 2,
height: TILE_HEIGHT + DURATION_WINDOW_BORDER_WIDTH * 2,
borderTopRightRadius: 8,
borderBottomRightRadius: 8,
backgroundColor: 'yellow',
zIndex: 25,
},
framesLine: {
width: SCREEN_WIDTH,
position: 'absolute',
},
loadingFrame: {
width: TILE_WIDTH,
height: TILE_HEIGHT,
backgroundColor: 'rgba(0,0,0,0.05)',
borderColor: 'rgba(0,0,0,0.1)',
borderWidth: 1,
},
prependFrame: {
width: SCREEN_WIDTH / 2 - DURATION_WINDOW_WIDTH / 2,
},
appendFrame: {
width: SCREEN_WIDTH / 2 - DURATION_WINDOW_WIDTH / 2,
},
});
export default App;
Perfect, let’s now focus on interactions between the timeline and the video.
Interacting with the timeline
If we get back to our specifications, we still have to complete the ones dealing with video interactions:
- ✅ Given the video is loaded, then we see that the timeline is placed at the start of the duration window.
- ✅ Given the video timeline goes out of the duration window, then it bounces back to its borders.
- Given the video is loaded, then the video loops only in the duration window.
- Given the user scrolls the timeline, then the video scrubs to the frame represented on the pop line at the center of the duration window.
- Given the user releases the scroll, then the video plays again, starting from the “pop” line and looping forward.
For spec 3, we need to watch the current video time: when it passes the time represented at the right border, we seek the video back to the time represented at the left border. So first, we need three things: an event handler that keeps the timeline offset updated in state, and two functions that give the times represented at the borders for a given offset.
const handleOnScroll = ({nativeEvent}) => {
setFramesLineOffset(nativeEvent.contentOffset.x);
};
const getLeftLinePlayTime = offset => {
return offset / (FRAME_PER_SEC * TILE_WIDTH);
};
const getRightLinePlayTime = offset => {
return (offset + DURATION_WINDOW_WIDTH) / (FRAME_PER_SEC * TILE_WIDTH);
};
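A quick sanity check of the math, assuming FRAME_PER_SEC = 1 and TILE_WIDTH = 40, so one second of video maps to 40 px of timeline and the 4-second duration window is 160 px wide:
// Hypothetical offset of 120 px after some scrolling:
getLeftLinePlayTime(120); // 120 / (1 * 40) = 3 -> left border sits at second 3
getRightLinePlayTime(120); // (120 + 160) / (1 * 40) = 7 -> right border at second 7
// The window therefore covers seconds 3 to 7, i.e. exactly 4 seconds of video.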
We use them in the video event handler onProgress:
const videoPlayerRef = useRef();
// ...
const handleOnProgress = ({currentTime}) => {
if (currentTime >= getRightLinePlayTime(framesLineOffset)) {
videoPlayerRef.current.seek(getLeftLinePlayTime(framesLineOffset));
}
};
// ...
<Video
ref={videoPlayerRef}
style={styles.video}
resizeMode={'cover'}
source={{uri: selectedVideo.uri}}
repeat={true}
onLoad={handleVideoLoad}
onProgress={handleOnProgress}
/>
And we place the onScroll event handler on the ScrollView element, after initialising the state variable framesLineOffset:
const [framesLineOffset, setFramesLineOffset] = useState(0); // number
// ...
<ScrollView
showsHorizontalScrollIndicator={false}
horizontal={true}
style={styles.framesLine}
alwaysBounceHorizontal={true}
scrollEventThrottle={1}
onScroll={handleOnScroll}>
<View style={styles.prependFrame} />
{frames.map((frame, index) => renderFrame(frame, index))}
<View style={styles.appendFrame} />
</ScrollView>
Finally (youhou!), specs 4 and 5 only require seeking the playback to the time represented by the “pop line”. That means two extra lines in the handleOnScroll event handler, plus a function similar to getLeftLinePlayTime and getRightLinePlayTime, but for the pop line:
const getPopLinePlayTime = offset => {
return (
(offset + (DURATION_WINDOW_WIDTH * parseFloat(POPLINE_POSITION)) / 100) /
(FRAME_PER_SEC * TILE_WIDTH)
);
};
const handleOnScroll = ({nativeEvent}) => {
const playbackTime = getPopLinePlayTime(nativeEvent.contentOffset.x);
videoPlayerRef.current?.seek(playbackTime);
setFramesLineOffset(nativeEvent.contentOffset.x);
};
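With POPLINE_POSITION set to '50%', the pop line time is simply the midpoint of the window. Continuing the example above (offset of 120 px, 160 px window, 40 px per second):
getPopLinePlayTime(120); // (120 + (160 * 50) / 100) / (1 * 40) = 200 / 40 = 5
// Second 5 is exactly halfway between the borders at seconds 3 and 7.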
And to make it clear to the user that scrolling seeks the video at the pop line, we will pause the video while the user is touching the timeline, using the ScrollView's onTouchStart and onTouchEnd event handlers (plus onMomentumScrollEnd, to get acceptable behavior on Android devices):
const [paused, setPaused] = useState(false);
//...
const handleOnTouchEnd = () => {
setPaused(false);
};
const handleOnTouchStart = () => {
setPaused(true);
};
//...
<Video
ref={videoPlayerRef}
style={styles.video}
resizeMode={'cover'}
source={{uri: selectedVideo.uri}}
repeat={true}
paused={paused}
onLoad={handleVideoLoad}
onProgress={handleOnProgress}
/>
//...
<ScrollView
showsHorizontalScrollIndicator={false}
horizontal={true}
style={styles.framesLine}
alwaysBounceHorizontal={true}
scrollEventThrottle={1}
onScroll={handleOnScroll}
onTouchStart={handleOnTouchStart}
onTouchEnd={handleOnTouchEnd}
onMomentumScrollEnd={handleOnTouchEnd}>
🙌 All specs: completed!
- ✅ Given the video is loaded, then we see that the timeline is placed at the start of the duration window.
- ✅ Given the video timeline goes out of the duration window, then it bounces back to its borders.
- ✅ Given the video is loaded, then the video loops only in the duration window.
- ✅ Given the user scrolls the timeline, then the video scrubs to the frame represented on the pop line at the center of the duration window.
- ✅ Given the user releases the scroll, then the video plays again, starting from the “pop” line and looping forward.
Et voilà! You have everything you need to create a video app with a timeline for iOS and Android.
Find the complete version on GitHub.
Some limits
- Async frame generation. Currently, the frames generated by FFmpeg are displayed only once they have all been computed. It would be better to generate each frame asynchronously and display it as soon as it is ready; one possible direction is sketched after this list.
- Android emulator. To test on Android, I only relied on the emulator, adding a video to the Google Photos app by dragging and dropping it from my desktop. In Google Photos, the video stalls at random times, which also happens when playing it with react-native-video. I expect this to go away with native videos on a real device.
- Playing animation. As you can see in the real app demoed above, there is a playing-cursor animation, but I am not satisfied with it yet. It is an important part of the UX, and I want to dig deeper into it before presenting it. This could lead to a new dedicated post 🔥
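About the first limitation, here is a rough sketch of one possible direction (untested, and assuming FFmpegKit's statistics callback fires regularly during encoding). The Statistics object exposes getVideoFrameNumber(), the number of video frames produced so far, so the wrapper could notify the app each time new frames land on disk:
// lib/FFmpeg.js: sketch with a hypothetical onFrameReady(index) callback
let lastReportedFrame = 0;
FFmpegKit.executeAsync(
  ffmpegCommand,
  async session => {
    // ...same completion handling as before
  },
  log => console.log(log.getMessage()),
  statistics => {
    const encodedFrames = statistics.getVideoFrameNumber(); // frames written so far
    for (let i = lastReportedFrame; i < encodedFrames; i++) {
      onFrameReady(i); // App.js could then flip frames[i] to READY one by one
    }
    lastReportedFrame = encodedFrames;
  },
);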
Thanks for reading!
We are Nollie Studio, a mobile experience studio focused on rider communities. If you want to work with us, you can find us on our collective.work page.
And if you enjoyed the read, give us a clap or share it with your friends 🙏.