The /s3/store Robot

We export to the storage platform of your choice.

The /s3/store Robot exports encoding results to Amazon S3.

Parameters

credentials (required)
Type: String
Create your associated Template Credentials in your Transloadit account and use the name of your Template Credentials as this parameter's value. They contain the values for your S3 bucket, key, secret, and bucket region. While we recommend using Template Credentials at all times, some use cases demand dynamic credentials, for which static Template Credentials are too unwieldy. If you have this requirement, feel free to use the following parameters instead: "bucket", "bucket_region" (for example: "us-east-1" or "eu-west-2"), "key", "secret".

path
Type: String
Default: "${unique_prefix}/${file.url_name}"
The path at which the file is to be stored. This may include any available Assembly variables.

url_prefix
Type: String
Default: "http://{bucket}.s3.amazonaws.com/"
The URL prefix used for the returned URL, such as "http://my.cdn.com/some/path".

acl
Type: String
Default: "public-read"
The permissions used for this file. This can be "public-read", "public", "private", or "bucket-default".

headers
Type: Object
Default: { "Content-Type": file.mime }
A JavaScript object containing a list of headers to be set for this file on S3, such as { FileURL: "${file.url_name}" }. This can also include any available Assembly variables.

host
Type: String
Default: "s3.amazonaws.com"
The host of the storage service. This only needs to be set when the storage service is not Amazon S3, but has a compatible API (such as hosteurope.de).

no_vhost
Type: Boolean
Default: false
Set this to true if you use a custom host and run into access denied errors.

sign_urls_for
Type: Integer
Default: not set
Provides signed URLs in the result JSON, in the signed_url and signed_ssl_url properties. The value of this parameter is the URL expiry time in seconds. If this parameter is not set, no URL signing is done.
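As a sketch of how these parameters combine, the following Assembly Instructions store files privately under a custom path, set an extra Cache-Control header, and request signed URLs that expire after one hour. The credentials name "my_s3_credentials" and the "attachments" path prefix are placeholders for your own values:

{
  "steps": {
    "exported": {
      "use": [":original"],
      "robot": "/s3/store",
      "credentials": "my_s3_credentials",
      "path": "attachments/${unique_prefix}/${file.url_name}",
      "acl": "private",
      "headers": {
        "Content-Type": "${file.mime}",
        "Cache-Control": "max-age=31536000"
      },
      "sign_urls_for": 3600
    }
  }
}

Because the acl here is "private", the stored file is not publicly readable; the signed_url in the result JSON then grants time-limited access instead.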

Live demos

Our /s3/store Robot can be used in combination with other Robots to create powerful workflows unique to your use case.
Here are a few example scenarios, and the required Assembly Instructions to implement them.
You can also try demos of these examples right here, live on our website.

Copy many files from FTP into an Amazon S3 bucket

{
  "steps": {
    "imported": {
      "robot": "/ftp/import",
      "result": true,
      "credentials": "YOUR_CREDENTIALS",
      "path": "my_folder/desert.jpg"
    },
    "exported": {
      "use": [":original"],
      "robot": "/s3/store",
      "result": true,
      "credentials": "demo_s3_credentials"
    }
  }
}
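Note that the path in this demo points at a single file to keep the example short. To copy many files at once, point the /ftp/import Robot's path at a directory instead; every imported file then flows through the same /s3/store Step.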

Encode a video, extract 8 thumbnails and store everything in an S3 bucket

{
  "steps": {
    ":original": {
      "robot": "/upload/handle"
    },
    "ipad_encoded": {
      "use": [":original"],
      "robot": "/video/encode",
      "ffmpeg_stack": "v2.2.3",
      "preset": "ipad-high"
    },
    "thumbnailed": {
      "use": ["ipad_encoded"],
      "robot": "/video/thumbs",
      "result": true,
      "ffmpeg_stack": "v2.2.3"
    },
    "exported": {
      "use": [":original", "ipad_encoded", "thumbnailed"],
      "robot": "/s3/store",
      "credentials": "demo_s3_credentials"
    }
  }
}
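The thumbnailed Step relies on the /video/thumbs Robot's default of extracting 8 thumbnails, which is where the number in this demo's title comes from. Also note that the thumbnails are taken from the encoded video here; use [":original"] in that Step's use parameter to thumbnail the source file instead.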

Store uploaded files in an Amazon S3 bucket

{
  "steps": {
    ":original": {
      "robot": "/upload/handle"
    },
    "exported": {
      "use": [":original"],
      "robot": "/s3/store",
      "credentials": "demo_s3_credentials"
    }
  }
}

Did you know?

You can easily combine Robots to create powerful workflows, unique to your business.
This is the power of Transloadit.