This project is an integration between a Salesforce org, Salesforce Functions, and Amazon S3.
The project is complemented by a companion Node.js app that allows Salesforce users to download Amazon S3 documents.
The goal of the integration is to export documents from Salesforce to S3 to reduce file storage consumption on the Salesforce side.
Thanks to Functions, we transfer documents to S3 with the following scenario (a code sketch of the function side follows the list):
- User uploads and attaches a document to a record in Salesforce.
- Apex trigger kicks in after the document is saved and calls a handler class with the document metadata (not the document binary content to avoid Apex limits).
- Apex trigger handler class invokes a Salesforce Function asynchronously with the document metadata.
- Function retrieves the document content using the Salesforce REST API.
- Function uploads the document content to an Amazon S3 bucket.
- Once the document is uploaded, the function creates a Salesforce record that links the S3 document to the original record, then calls an Apex callback method on the trigger handler class.
- Apex callback method removes the original document from Salesforce.
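
To make the flow concrete, here is a minimal sketch of the function side of this scenario. It assumes Node.js 18+ (for the global `fetch`), the AWS SDK v3, and the conventions of the Salesforce Functions Node.js SDK; the event payload fields and the junction record fields are illustrative assumptions, not the project's actual contract.

```js
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: process.env.AWS_REGION });

export default async function (event, context, logger) {
  // Illustrative payload shape: the trigger handler sends metadata only.
  const { contentVersionId, objectApiName, recordId } = event.data;

  // 1. Retrieve the document binary through the Salesforce REST API.
  const url =
    `${context.org.baseUrl}/services/data/v${context.org.apiVersion}` +
    `/sobjects/ContentVersion/${contentVersionId}/VersionData`;
  const response = await fetch(url, {
    headers: { Authorization: `Bearer ${context.org.dataApi.accessToken}` },
  });
  const body = Buffer.from(await response.arrayBuffer());

  // 2. Upload the document content to the S3 bucket.
  const key = `${recordId}/${contentVersionId}`;
  await s3.send(
    new PutObjectCommand({
      Bucket: process.env.AWS_S3_BUCKET,
      Key: key,
      Body: body,
    })
  );

  // 3. Create the junction record that links the S3 document to the
  //    original record (naming convention described in the setup steps).
  await context.org.dataApi.create({
    type: `S3_${objectApiName.replace(/__c$/, "")}_Document__c`,
    fields: {}, // lookup and URL fields omitted in this sketch
  });

  logger.info(`Uploaded document ${contentVersionId} to S3 as ${key}`);
}
```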
## Prerequisites

- Sign up for a Salesforce Functions trial org.
- Enable Dev Hub in your org.
- Install the Salesforce CLI.
- Authorize your Dev Hub in the Salesforce CLI.
- Sign up for an AWS free-tier account.
- Create an S3 bucket.
- Complete these steps in the Identity and Access Management (IAM) console:
  - Create a policy that grants write access on your S3 bucket (see the example policy after this list).
  - Create a user.
  - Assign your policy to the user.
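
As an illustration, a minimal IAM policy granting the write access mentioned above could look like this (replace `YOUR_BUCKET_NAME`; adjust the actions to your needs):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    }
  ]
}
```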
## Installation

- Install the project in a scratch org by running this script:

  MacOS or Linux:

  ```sh
  ./install-dev.sh
  ```

  Windows:

  ```sh
  install-dev.bat
  ```

  You can install the project on other types of Salesforce orgs by looking at the content of the scripts and adapting the commands.
- For each object that you would like to export documents for (Account in this example), create a record for the "S3 Document Setting" custom metadata type:

  - Navigate to Custom Code > Custom Metadata Types in Salesforce Setup.
  - Click Manage Records for "S3 Document Setting".
  - Click New.
  - Assuming that we want to work with the Account object, enter these values then click Save:
    - Label: S3 Account Document
    - S3 Document Setting Name: S3_Account_Document
    - Object API Name: Account
- For each object that you would like to export documents for, create a junction object between `S3_Document__c` and your object (Account based on the previous example):

  - Navigate to Object Manager in Salesforce Setup.
  - Click Create and select Custom Object.
  - Enter these values then click Save:
    - Label: S3 Account Document
    - Plural Label: S3 Account Documents
    - Object Name: S3_Account_Document__c
    - Record Name: S3 Account Document ID
    - Data Type: Auto Number
    - Display Format: S3-ACC-DOC-{0000}
    - Starting Number: 0

  Note: the object name is automatically selected by the Function, so naming must follow this convention: `S3_OBJECT_Document__c`, where `OBJECT` is the API name of the object without the trailing `__c` for custom objects. For example, if you have a `My_Custom_Object__c` object, you should enter `S3_My_Custom_Object_Document__c` (see the naming sketch after this list).
- Optional: for each object that you would like to export documents for, configure the related list layout to display relevant fields.
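
For illustration, the junction object naming convention can be expressed as a small helper (hypothetical code, mirroring the rule above):

```js
// Hypothetical helper that mirrors the junction object naming convention.
const junctionObjectName = (objectApiName) =>
  `S3_${objectApiName.replace(/__c$/, "")}_Document__c`;

junctionObjectName("Account"); // "S3_Account_Document__c"
junctionObjectName("My_Custom_Object__c"); // "S3_My_Custom_Object_Document__c"
```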
## Deployment and testing

You have two options for this step: deploy the function to a compute environment or run it locally. Make sure to refer to the relevant subsection below and check the environment variables reference for the appropriate configuration.
### Deploy to a compute environment

Follow these steps to deploy your function to a compute environment:

- Log in to Salesforce Functions (you may have to repeat this command later as the login eventually times out):

  ```sh
  sf login functions
  ```

- Create a compute environment:

  ```sh
  sf env create compute --alias s3env --connected-org s3
  ```

- Deploy the Salesforce Function:

  ```sh
  cd functions/s3import
  sf deploy functions -o s3
  ```

- Configure the Salesforce Function with the following commands (see environment variables reference):

  ```sh
  sf env var set AWS_ACCESS_KEY_ID=XXXXXXXXXX -e s3env
  sf env var set AWS_SECRET_ACCESS_KEY=XXXXXXXXXX -e s3env
  sf env var set AWS_REGION=XXXXXXXXXX -e s3env
  sf env var set AWS_S3_BUCKET=XXXXXXXXXX -e s3env
  sf env var set DOWNLOAD_URL_PREFIX='XXXXXXXXXX' -e s3env
  ```
### Run the function locally

Follow these steps to test your function locally:

- Create a `.env` file in the `functions/s3import` directory. Use the following template and make sure to replace values accordingly (see environment variables reference):

  ```sh
  AWS_ACCESS_KEY_ID=XXXXXXXXXX
  AWS_SECRET_ACCESS_KEY=XXXXXXXXXX
  AWS_REGION=XXXXXXXXXX
  AWS_S3_BUCKET=XXXXXXXXXX
  DOWNLOAD_URL_PREFIX=XXXXXXXXXX
  ```

- Run these commands to start the function locally:

  ```sh
  cd functions/s3import
  sf run function start
  ```
## Environment variables reference

| Variable Name | Description | Example |
| --- | --- | --- |
| `AWS_ACCESS_KEY_ID` | The access key ID for your AWS IAM user. | secret |
| `AWS_SECRET_ACCESS_KEY` | The secret access key for your AWS IAM user. | secret |
| `AWS_REGION` | The region of your S3 bucket. | eu-west-3 |
| `AWS_S3_BUCKET` | The name of your S3 bucket. | poz-sf-demo |
| `DOWNLOAD_URL_PREFIX` | An optional prefix prepended to the S3 download URL. This is useful for redirecting users to a proxy that checks Salesforce authentication before downloading the file. | https://my-proxy.herokuapp.com/download?url= |
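
As an illustration, here is how an optional `DOWNLOAD_URL_PREFIX` would combine with an S3 URL (a sketch; the exact URL layout is an assumption):

```js
// Sketch: the prefix is prepended to the S3 download URL, so users hit the
// auth-checking proxy first instead of going straight to S3.
const s3Url = "https://poz-sf-demo.s3.eu-west-3.amazonaws.com/some-document.pdf";
const downloadUrl = `${process.env.DOWNLOAD_URL_PREFIX ?? ""}${s3Url}`;
// => https://my-proxy.herokuapp.com/download?url=https://poz-sf-demo.s3...
```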
## Monitoring

Monitor Salesforce Function logs by running:

```sh
sf env log tail -e s3env
```

Monitor Salesforce logs by running:

```sh
sfdx force:apex:log:tail -c
```