Export All YouTube Channel Videos And Links In CSV Using Python

Michael Mares
5 min read · Oct 29, 2023

I am currently restructuring my blog, https://studentsurvivalguide.dk/, and I want to display all the videos we've been producing in one place. Manually copy-pasting your videos is only an option if you have a handful of them.

It gets even trickier when you also want to export statistics such as likes, comments and views, for example to rank the content on your site by popularity.

In this tutorial, I will show you an easy way to export any YouTube channel's videos to CSV using Google Colab, a free Python notebook that sits on your Google Drive.

This is a beginner-friendly tutorial: all you need to do is copy my notebook, add your channel ID, and the rest happens automatically.

Let’s start!

Step 1

Enable the YouTube Data API in a Google Cloud Project

Create a Google Cloud account and set up a new project. Once you are in, go to APIs & Services.

Here you need to search for 'YouTube Data API v3'. Click on the result, then click the 'Enable' button.

Now we need to create an API key (under APIs & Services, go to Credentials, then Create Credentials, then API key). You can think of it as a virtual ID that proves to the YouTube servers that you are who you say you are.

Never share this key with anyone! As soon as someone has access to your API key, they can make requests on your behalf. You don't want that!

Now that we have our API key, we can move on to the next step — Google Colab.

Step 2 — Python

Create a new Google Colab environment or make a copy of mine.

Add your API key from Step 1 into the first cell of the notebook:

from googleapiclient.discovery import build  # part of google-api-python-client, preinstalled in Colab

youTubeApiKey = "ENTER YOUR API KEY HERE"
youtube = build("youtube", "v3", developerKey=youTubeApiKey)
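If you would rather not save the key inside the notebook itself (remember the warning from Step 1), here is a minimal alternative sketch using Python's built-in getpass, so the key is typed in at runtime instead of being stored in the file:

from getpass import getpass
from googleapiclient.discovery import build

# Prompt for the key each time the notebook runs; nothing is persisted in the notebook
youTubeApiKey = getpass("Paste your YouTube API key: ")
youtube = build("youtube", "v3", developerKey=youTubeApiKey)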

Next, find your channel ID.

Use your channel ID as found in your channel URL, for example https://youtube.com/channel/UC6bl28l2WcU80Ry2NUMcJBg. Everything after /channel/ is the ID, in this case UC6bl28l2WcU80Ry2NUMcJBg.
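If your channel URL shows a handle (youtube.com/@yourname) rather than /channel/, you can ask the API for the ID instead. A small sketch, assuming your client supports the forHandle parameter of channels().list (check the current API docs if it complains):

# Hedged sketch: look up the channel ID from a handle such as "@yourhandle"
handle_lookup = youtube.channels().list(part="id", forHandle="@yourhandle").execute()
print(handle_lookup["items"][0]["id"])  # the UC... channel ID to use below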

channel_id = "UC6bl28l2WcU80Ry2NUMcJBg"  # replace with your own channel ID
channelStats = youtube.channels().list(part="snippet,contentDetails,statistics", id=channel_id).execute()
# Every channel has an auto-generated "uploads" playlist containing all of its published videos;
# we could use any playlist here, but this one gives us everything
allUploadedVideosPlaylist = channelStats["items"][0]["contentDetails"]["relatedPlaylists"]["uploads"]
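As a quick optional sanity check, the same response also carries channel-level statistics, so you can peek at them before moving on. Field names below are as returned by the Data API:

# Optional: inspect the channel-level numbers returned alongside the uploads playlist ID
stats = channelStats["items"][0]["statistics"]
print("Subscribers:", stats.get("subscriberCount"))
print("Total views:", stats.get("viewCount"))
print("Video count:", stats.get("videoCount"))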

Now that we have the ID of the channel's uploads playlist, we can request its items to get the video IDs. The API returns at most 50 results per call, so we will page through it with the nextPageToken.

next_page_token = None
playlistData = youtube.playlistItems().list(playlistId=allUploadedVideosPlaylist,
                                            part='snippet',
                                            maxResults=50,
                                            pageToken=next_page_token).execute()

Now we will create a list to store the videos and keep requesting pages until there are none left:

videos = []
while True:
    playlistData = youtube.playlistItems().list(playlistId=allUploadedVideosPlaylist,
                                                part='snippet',
                                                maxResults=50,
                                                pageToken=next_page_token).execute()
    videos += playlistData['items']
    next_page_token = playlistData.get('nextPageToken')
    if next_page_token is None:
        break
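It is worth checking that the loop really picked everything up. The count should roughly match the videoCount the channel reports (they can differ slightly, for example because of non-public uploads):

# Optional sanity check against the channel's own video count
print(f"Fetched {len(videos)} videos; channel reports {channelStats['items'][0]['statistics']['videoCount']}")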

Now we can extract the video IDs, which we will use later to get statistics for each individual video.

video_ids = []
for i in range(len(videos)):
    video_ids.append(videos[i]["snippet"]["resourceId"]["videoId"])

Now we can get the engagement statistics for each video:

videoStatistics = []

for i in range(len(video_ids)):
    videoData = youtube.videos().list(id=video_ids[i], part="statistics").execute()
    videoStatistics.append(videoData["items"][0]["statistics"])
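Note that this makes one API request per video, which can eat into your daily quota on a large channel. The videos().list endpoint also accepts up to 50 comma-separated IDs per call, so a batched variant (a sketch, not what my notebook uses) could look like this:

# Batched variant: fetch statistics for up to 50 videos per request.
# Keying the results by video ID keeps them aligned with video_ids even if
# the API returns the items in a different order.
stats_by_id = {}
for start in range(0, len(video_ids), 50):
    batch = video_ids[start:start + 50]
    response = youtube.videos().list(id=",".join(batch), part="statistics").execute()
    for item in response["items"]:
        stats_by_id[item["id"]] = item["statistics"]

videoStatistics = [stats_by_id[v] for v in video_ids]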

Next, we construct several lists containing the data from our videos:

import pandas as pd

VideoTitle = []
url = []
Published = []
Views = []
LikeCount = []
Comments = []
# Note: dislike counts are no longer returned publicly by the API, so we skip them

for i in range(len(videos)):
    VideoTitle.append(videos[i]['snippet']['title'])
    url.append("https://www.youtube.com/watch?v=" + videos[i]['snippet']['resourceId']['videoId'])
    Published.append(videos[i]['snippet']['publishedAt'])
    Views.append(int(videoStatistics[i]['viewCount']))
    # likeCount is missing when a video hides its like count
    LikeCount.append(int(videoStatistics[i].get('likeCount', 0)))
    # commentCount is missing when comments are disabled
    if "commentCount" in videoStatistics[i]:
        Comments.append(int(videoStatistics[i]['commentCount']))
    else:
        Comments.append(0)

data = {"Video Title": VideoTitle, "Video url": url, "Published": Published,
        "Views": Views, "Like Count": LikeCount, "Comments": Comments}
df = pd.DataFrame(data)

df

This gives us a nice output of all the public videos on the channel (not just the first 50, since the loop above pages through every batch).
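And since the title promises a CSV, you can also save the table as a CSV file straight from Colab at this point. A small optional step (the filename is just my choice):

# Optional: write the DataFrame to a CSV file and download it from Colab
from google.colab import files

df.to_csv("youtube_videos.csv", index=False)
files.download("youtube_videos.csv")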

Sidenote: I have just noticed that this does not export the unlisted videos from YouTube. That makes sense, though!

Step 3. Let's now export this data to a Google Sheet

But what good is the data in a Python notebook if we cannot access it?

Hint: there are actually quite a lot of things we can do with this data. For example, we can get all the transcripts, use ChatGPT to tokenize them, and then use that output to divide our content into different categories.
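As a taste of that, transcripts can be pulled with the third-party youtube-transcript-api package (not part of this notebook, only videos with captions will work, and the exact call may differ between package versions):

# Hedged sketch: fetch the transcript of the first video in our list
# Requires: pip install youtube-transcript-api
from youtube_transcript_api import YouTubeTranscriptApi

transcript = YouTubeTranscriptApi.get_transcript(video_ids[0])
full_text = " ".join(chunk["text"] for chunk in transcript)
print(full_text[:500])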

Authentication:

One cool thing about Google Colab is that it comes with a built-in auth module. This means that you do not have to use service accounts or build your own authentication flow from scratch. You simply run the code and get a standard prompt to authenticate yourself.

#Authenticate to send data into Google Sheets
from google.colab import auth
import gspread
import json

auth.authenticate_user()

from google.auth import default
creds, _ = default()

gc = gspread.authorize(creds)
Click on ‘Allow’

Now you can go through the standard authentication flow. Select the account where the Google Sheet is and finish the authentication.

Now you need to create a new sheet and copy its URL. You can use this nifty shortcut to do it faster: https://sheets.new/

This will open a new Google Sheet for you.

Now we just need to turn our DataFrame into a list of rows and send it to the Google Sheet.
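If you want the sheet ranked by popularity, as mentioned in the intro, you can optionally sort the DataFrame before sending it:

# Optional: most viewed videos first
df = df.sort_values("Views", ascending=False)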

Replace the sheet_url and target_sheet variables with the URL of your new sheet and your desired worksheet name.

import gspread
from gspread.exceptions import WorksheetNotFound

sheet_url = "YOUR_SHEET_URL"
target_sheet = "YouTube Videos"

sheet = gc.open_by_url(sheet_url)

# Attempt to get the worksheet, and if not present, create it
try:
    worksheet1 = sheet.worksheet(target_sheet)
except WorksheetNotFound:
    worksheet1 = sheet.add_worksheet(title=target_sheet, rows="100", cols="20")

# Convert the data frame to a list of lists
df_values = df.values.tolist()
print(json.dumps(df_values))

# Append the rows to the sheet
worksheet1.append_rows(df_values)
worksheet1.insert_row(df.columns.values.tolist(), 1)  # Insert column headers at the top
All of the channel's YouTube videos in one place

If you followed the above steps, you have now successfully copied all of the channel's videos into a sheet.

Hooray! 🥳

Disclaimer: I’ve spent 3 hours on coding something that I could have likely copy-pasted in 40 minutes but remember, always work smart, not hard 😏
