The /s3/import Robot

We are happy to import from whatever storage solution suits you best.

The /s3/import Robot imports whole directories of files from your S3 bucket.

Parameters

credentials (required)
Type: String
Please create your associated Template Credentials in your Transloadit account and use the name of those Template Credentials as this parameter's value. They contain the values for your S3 bucket, key, secret and bucket region. While we recommend using Template Credentials at all times, some use cases demand dynamic credentials, for which static Template Credentials are too unwieldy. If you have this requirement, you can use the following parameters instead: "bucket", "bucket_region" (for example: "us-east-1" or "eu-west-2"), "key" and "secret". See the first example below this parameter list.

path (required)
Type: String / Array of Strings
The path in your bucket to a specific file or directory. If the path points to a file, only that file will be imported, for example: js/my_javascript_file.js. If it points to a directory, all files in that directory will be imported, for example: js/. For Transloadit to recognize that you want to import all files from a directory, make sure your path ends with a / slash. Directories are not imported recursively by default; to include subfolders and sub-subfolders, set the recursive parameter below. To import all files from the root directory, use / as the value here. In that case, make sure all your objects belong to a path: if you have objects in the root of your bucket that aren't prefixed with /, you'll receive an error: A client error (NoSuchKey) occurred when calling the GetObject operation: The specified key does not exist. You can also use an array of path strings here to import multiple paths in the same Robot Step.

recursive
Type: Boolean, Default: false
Setting this to true enables importing files from subfolders, sub-subfolders, and so on, of the given path. Please use the pagination parameters page_number and files_per_page wisely here; see the second example below this parameter list.

page_number
Type: Number, Default: 1
The pagination page number. To preserve backwards compatibility for non-recursive imports, this currently only works when recursive is set to true. When doing big imports, make sure no other scripts add or remove files within your path, otherwise pagination may produce unexpected results.

files_per_page
Type: Number, Default: 1000
The pagination page size. To preserve backwards compatibility for non-recursive imports, this currently only works when recursive is set to true.

ignore_errors
Type: Boolean, Default: false
Extracting metadata from an imported file can fail, for example when the file is zero bytes in size. Setting this parameter to true prevents the Robot from stopping the import (and the entire Assembly) when that happens.
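
For the dynamic-credentials case, here is a minimal sketch of an import Step that passes the "bucket", "bucket_region", "key" and "secret" parameters directly instead of Template Credentials. The bucket name, key and secret shown are placeholders to replace with your own values:

{
  "steps": {
    "imported": {
      "robot": "/s3/import",
      "path": "my_folder/",
      "bucket": "YOUR_BUCKET",
      "bucket_region": "us-east-1",
      "key": "YOUR_AWS_KEY",
      "secret": "YOUR_AWS_SECRET"
    }
  }
}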
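
Similarly, a recursive import of several paths with explicit pagination could look like the following sketch. It reuses the demo_s3_credentials Template Credentials from the demos below, and the two paths are placeholders:

{
  "steps": {
    "imported": {
      "robot": "/s3/import",
      "credentials": "demo_s3_credentials",
      "path": ["js/", "my_folder/"],
      "recursive": true,
      "page_number": 1,
      "files_per_page": 1000,
      "ignore_errors": true
    }
  }
}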

Live demos

Our /s3/import Robot can be used in combination with other Robots to create powerful workflows unique to your use case.
Here are a few example scenarios, and the required Assembly Instructions to implement them.
You can also try demos of these examples right here, live on our website.

Import a specific file from S3

{
  "steps": {
    "imported": {
      "robot": "/s3/import",
      "result": true,
      "credentials": "demo_s3_credentials",
      "path": "my_folder/desert.jpg"
    },
    "resized": {
      "use": ["imported"],
      "robot": "/image/resize",
      "result": true,
      "height": 130,
      "imagemagick_stack": "v2.0.3",
      "width": 130,
      "zoom": false
    },
    "exported": {
      "use": ["imported", "resized"],
      "robot": "/s3/store",
      "credentials": "demo_s3_credentials"
    }
  }
}

Import an entire folder of files from S3

{
  "steps": {
    "imported": {
      "robot": "/s3/import",
      "result": true,
      "credentials": "demo_s3_credentials",
      "path": "my_folder/"
    },
    "resized": {
      "use": ["imported"],
      "robot": "/image/resize",
      "result": true,
      "height": 130,
      "imagemagick_stack": "v2.0.3",
      "width": 130,
      "zoom": false
    },
    "exported": {
      "use": ["imported", "resized"],
      "robot": "/s3/store",
      "credentials": "demo_s3_credentials"
    }
  }
}

Resize all images in an S3 bucket

{
  "steps": {
    "imported": {
      "robot": "/s3/import",
      "result": true,
      "credentials": "demo_s3_credentials",
      "path": "my_folder/"
    },
    "resized": {
      "use": ["imported"],
      "robot": "/image/resize",
      "result": true,
      "height": 130,
      "imagemagick_stack": "v2.0.3",
      "width": 130,
      "zoom": false
    },
    "exported": {
      "use": ["resized"],
      "robot": "/s3/store",
      "result": true,
      "credentials": "demo_s3_credentials",
      "path": "${file.original_path}-resized@130/${file.basename}.${file.ext}"
    }
  }
}
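
Note how the export Step's path in this demo uses Assembly Variables such as ${file.original_path}, ${file.basename} and ${file.ext}, so each resized image is stored alongside the original folder structure under a -resized@130 suffix.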

Did you know?

You can easily combine Robots to create powerful workflows, unique to your business.
This is the power of Transloadit.