Export files to YouTube from the command line

Automating your YouTube uploads can save you significant time. In this tutorial, you will learn how to leverage youtubeuploader—a modern, Go-based CLI tool—to efficiently manage your video publishing workflow through the command line.
Important YouTube API limitations
Before diving into the implementation, note these critical restrictions imposed by the YouTube API:
- New API projects are restricted to uploading private videos until verified by Google.
- The default quota of 10,000 units per day permits approximately six video uploads per 24 hours, since each upload costs roughly 1,600 units.
- OAuth 2.0 credentials must be correctly configured.
- Unverified apps require adding test users in the OAuth consent screen.
Installing youtubeuploader
Download and install the precompiled youtubeuploader binary for your platform:
# For Linux 64-bit
wget https://github.com/porjo/youtubeuploader/releases/download/v1.24.4/youtubeuploader_1.24.4_linux_amd64.tar.gz
tar xf youtubeuploader_1.24.4_linux_amd64.tar.gz
# For macOS
wget https://github.com/porjo/youtubeuploader/releases/download/v1.24.4/youtubeuploader_1.24.4_darwin_amd64.tar.gz
tar xf youtubeuploader_1.24.4_darwin_amd64.tar.gz
For Windows users, download the appropriate zip file from the releases page and extract it.
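After extracting, you may want to move the binary onto your PATH so it can be invoked from anywhere. This assumes the extracted binary is named youtubeuploader; adjust the destination to your preference:
# Optional: install the binary system-wide
sudo mv youtubeuploader /usr/local/bin/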
Setting up OAuth authentication
Configure OAuth 2.0 credentials by following these steps:
- Visit the Google Cloud Console.
- Create a new project or select an existing one.
- Navigate to APIs & Services and enable the YouTube Data API v3.
- Configure the OAuth consent screen by setting your application name, support email, required scopes, and adding test users (for unverified apps).
- Create OAuth credentials by selecting the "Web application" type and adding http://localhost:8080/oauth2callback as an authorized redirect URI.
- Download the resulting client_secrets.json file and place it in your working directory (an example of its structure follows this list).
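For reference, a client_secrets.json downloaded for a web application typically looks like the following sketch; the identifiers shown here are placeholders, not real credentials:
{
  "web": {
    "client_id": "1234567890-example.apps.googleusercontent.com",
    "project_id": "my-upload-project",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://oauth2.googleapis.com/token",
    "client_secret": "YOUR_CLIENT_SECRET",
    "redirect_uris": ["http://localhost:8080/oauth2callback"]
  }
}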
Basic upload commands
Below are practical examples to upload videos using youtubeuploader:
# Simple video upload
youtubeuploader -filename video.mp4 -title "My Video"
# Upload with metadata from parameters and a JSON file
youtubeuploader -filename video.mp4 \
-title "My Video" \
-description "Video description" \
-privacy "private" \
-tags "tag1,tag2" \
-metaJSON metadata.json
# Upload with rate limiting (1000 kbps)
youtubeuploader -filename video.mp4 \
-title "My Video" \
-ratelimit 1000
Managing video metadata
Create a metadata.json file to specify comprehensive video information:
{
"title": "My Video",
"description": "Video description\nWith multiple lines",
"tags": ["tag1", "tag2"],
"privacyStatus": "private",
"madeForKids": false,
"embeddable": true,
"license": "creativeCommon",
"publicStatsViewable": true,
"categoryId": "22",
"recordingdate": "2024-02-05"
}
Upload the video with metadata by running:
youtubeuploader -filename video.mp4 -metaJSON metadata.json
Batch upload script
The following shell script automates the upload of multiple videos:
#!/bin/bash
VIDEO_DIR="./videos"
METADATA_DIR="./metadata"
RATE_LIMIT=1000 # Kbps
upload_with_retry() {
local video=$1
local metadata=$2
local max_attempts=3
local attempt=1
while [ $attempt -le $max_attempts ]; do
if [ -f "$metadata" ]; then
youtubeuploader -filename "$video" \
-metaJSON "$metadata" \
-ratelimit "$RATE_LIMIT" && return 0
else
youtubeuploader -filename "$video" \
-title "$(basename "$video" .mp4)" \
-privacy "private" \
-ratelimit "$RATE_LIMIT" && return 0
fi
echo "Attempt $attempt failed. Retrying in 30 seconds..."
attempt=$((attempt + 1))
sleep 30
done
return 1
}
for video in "$VIDEO_DIR"/*.mp4; do
filename=$(basename "$video")
name="${filename%.*}"
metadata="$METADATA_DIR/$name.json"
echo "Processing $filename"
if upload_with_retry "$video" "$metadata"; then
echo "Successfully uploaded $filename"
else
echo "Failed to upload $filename after multiple attempts"
fi
# Respect YouTube API quotas
sleep 300 # Wait 5 minutes between uploads
done
Make the script executable with:
chmod +x batch-upload.sh
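As a usage sketch, the script expects each video in ./videos to have an optional metadata file with the same base name in ./metadata; the file names below are purely illustrative:
# Example layout (hypothetical file names)
# videos/intro.mp4     -> metadata/intro.json
# videos/tutorial.mp4  -> no matching metadata, uploaded with defaults
./batch-upload.sh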
Automated upload integration
This Node.js script watches a directory and automatically uploads new videos:
const { exec } = require('child_process')
const fs = require('fs')
const path = require('path')
const util = require('util')
const execAsync = util.promisify(exec)
const RATE_LIMIT = 1000 // Kbps
const processAndUpload = async (videoPath) => {
try {
const metadata = {
title: path.basename(videoPath, path.extname(videoPath)),
description: `Uploaded on ${new Date().toISOString()}`,
tags: ['auto-upload'],
privacyStatus: 'private',
madeForKids: false,
embeddable: true,
publicStatsViewable: true,
}
const metadataPath = `${videoPath}.json`
fs.writeFileSync(metadataPath, JSON.stringify(metadata, null, 2))
const command = `youtubeuploader -filename "${videoPath}" -metaJSON "${metadataPath}" -ratelimit ${RATE_LIMIT}`
const { stdout, stderr } = await execAsync(command)
if (stderr) {
console.error(`Upload stderr: ${stderr}`)
}
console.log(`Upload successful: ${stdout}`)
fs.unlinkSync(metadataPath)
} catch (error) {
console.error(`Upload failed: ${error.message}`)
}
}
const watchDir = './uploads'
fs.watch(watchDir, (eventType, filename) => {
if (eventType === 'rename' && filename.match(/\.(mp4|mov)$/i)) {
const videoPath = path.join(watchDir, filename)
if (fs.existsSync(videoPath)) {
setTimeout(() => {
processAndUpload(videoPath)
}, 1000) // Wait for file to be fully written
}
}
})
console.log(`Watching ${watchDir} for new videos...`)
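Save the script under a name of your choosing, for example watch-upload.js (a hypothetical name), and start it from the directory containing your client_secrets.json so youtubeuploader can authenticate:
node watch-upload.js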
Troubleshooting common issues
Rate limiting and bandwidth control
Control the upload bandwidth to avoid network congestion:
# Limit upload speed to 1000 kbps
youtubeuploader -filename video.mp4 -ratelimit 1000
# Limit bandwidth during business hours
youtubeuploader -filename video.mp4 -ratelimit 1000 -limitBetween 9:00-17:00
Authentication issues
If you encounter authentication problems:
- Delete existing OAuth tokens (typically located in ~/.youtubeuploader.json).
- Regenerate client_secrets.json from the Google Cloud Console.
- Run youtubeuploader again to initiate a new authentication flow (a minimal sequence is sketched after this list).
- Ensure test users are added in the OAuth consent screen if your app remains unverified.
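A minimal recovery sequence might look like the following; the token path is taken from the list above and may differ if you configured a different cache location:
# Remove the cached token, then trigger a fresh browser-based OAuth flow
rm -f ~/.youtubeuploader.json
youtubeuploader -filename video.mp4 -title "Auth test" -privacy "private"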
Network and API errors
For intermittent failures:
- Implement retry logic with increasing delays (a backoff sketch follows this list).
- Monitor API quota usage in the Google Cloud Console.
- Use the -ratelimit flag to manage bandwidth.
- Introduce delays between uploads (a wait time of 300 seconds or more is recommended).
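Here is a minimal sketch of retrying with exponential backoff, in the same shell style as the batch script above; the function name and attempt count are arbitrary choices:
# Retry an upload, doubling the delay after each failed attempt
upload_with_backoff() {
  local video=$1
  local delay=30
  for attempt in 1 2 3 4; do
    youtubeuploader -filename "$video" \
      -title "$(basename "$video" .mp4)" \
      -privacy "private" && return 0
    echo "Attempt $attempt failed. Retrying in ${delay}s..."
    sleep "$delay"
    delay=$((delay * 2))
  done
  return 1
}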
Conclusion
youtubeuploader offers a robust solution for automating YouTube uploads via the command line. Its features for metadata management, rate limiting, and batch processing make it an ideal tool for developers aiming to streamline their video publishing workflow.
For advanced video processing needs before uploading to YouTube, consider using Transloadit's Video Encoding Service to prepare your videos for optimal quality and compatibility.