# Working with the filesystem in Node.js

You'll commonly need to work with files in a workflow: for example, downloading content from one service to upload to another. This doc explains how to work with files in Pipedream workflows and provides sample code for common operations.

## The /tmp directory

Within a workflow, you have full read-write access to the `/tmp` directory, with 512 MB of available space for saving files.

Data in `/tmp` is cleared after your workflow runs, so subsequent runs will not have access to files saved in previous executions. For high-volume workflows, data may occasionally be retained across executions, but you should never expect files from a previous run to be available.

## Writing a file to /tmp

Use the `fs` module to write data to `/tmp`:

```javascript
const fs = require("fs");

// Write a Buffer (or a string) to a file in /tmp
fs.writeFileSync("/tmp/myfile", Buffer.from("hello, world"));
```

## Listing files in /tmp

This code sample uses step exports to return a list of the files saved in `/tmp`, which you can reference in future steps of your workflow:

```javascript
const fs = require("fs");

// Export the directory listing so later steps can use it
this.tmpFiles = fs.readdirSync("/tmp");
```

## Reading a file from /tmp

This code sample uses step exports to return the contents of a test file saved in `/tmp` as a string (`fs.readFileSync` returns a `Buffer`, so we call `toString()`):

```javascript
const fs = require("fs");

// Read the file and convert the Buffer to a UTF-8 string
this.fileData = fs.readFileSync("/tmp/myfile").toString();
```

## Download a file, uploading it in another multipart/form-data request

This workflow downloads a file from a specified download URL and uploads it to an upload URL as multipart/form-data.

## Download email attachments to /tmp, upload to Amazon S3

This workflow is triggered by incoming emails. When you copy it, you'll get a workflow-specific email address that you can send any email to. The workflow takes any attachments included with inbound emails, saves them to `/tmp`, and uploads them to Amazon S3.

You should also be aware of the inbound payload limits associated with the email trigger.