Use Case – Reset Data in Sandbox Org
ubraig edited this page Jul 12, 2023
In this advanced example scenario, we want to reset a training environment in a Salesforce sandbox org:
- Delete all existing data by exporting and then hard-deleting a set of objects in a specific order.
- Upload (initial load) fresh data from a set of .csv files with well-defined test/training data.
Prerequisites
- A key file `MyKeyfile.key` and the password + security token for the user have been prepared with the `New-SfEncryptionKeyFile` and `ConvertTo-SfEncryptedString` commands.
- Based on this, we can generate an authentication token for the target org in the `$MyOrg` variable via `Get-SfAuthToken`.
- The user has been assigned the 'Bulk API Hard Delete' permission.
- The sequence of object names in the delete job is optimized so that, e.g., child records are deleted before their master records.
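The preparation step could look roughly like the following sketch. Note that the exact parameter names of `New-SfEncryptionKeyFile` and `ConvertTo-SfEncryptedString` are assumptions here and may differ; check the command reference of your installed module version:

```powershell
# One-time step: create the encryption key file (parameter name assumed)
New-SfEncryptionKeyFile -KeyFile MyKeyfile.key

# Encrypt password + security token as one string; the output is the value
# used as 'EncryptionResultFromPreviousStep' in the scripts below
ConvertTo-SfEncryptedString -KeyFile MyKeyfile.key -String 'MyPasswordMySecurityToken'
```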
We will:
- Prepare the Auth Token from the encrypted password.
- Create the list of object names in the desired order of execution.
- Loop through the list to get all the record ids.
- Loop through the list to run the hard-delete job.
The corresponding script would look like:
```powershell
$MyEncryptedPassword = 'EncryptionResultFromPreviousStep'
$MyOrg = Get-SfAuthToken -Username [email protected] -EncryptedString $MyEncryptedPassword -KeyFile MyKeyfile.key -InstanceUrl https://test.salesforce.com
$MyObjectsToDeleteList = @(
    'CampaignMember',
    'Campaign',
    'Lead',
    'Contact',
    'Account'
)
foreach ($Object in $MyObjectsToDeleteList) {
    Export-SfRecords $MyOrg $Object -Bulk Serial
    Remove-SfRecords $MyOrg $Object -Bulk SerialHardDelete
}
```
Notes on implicit behaviour involved here:
- On the export, we do not provide a SOQL statement, so it defaults to `SELECT Id FROM <ObjectName>`.
- Neither on the export nor on the remove operation do we provide a path to a .csv file, so it always defaults to `<ObjectName>.csv` in the current directory.
- For the remove operation we do not provide an .sdl mapping file, so one is auto-created from the column headers of the .csv file. This results in a mapping file named `<ObjectName>.sdl` with a single entry: `Id=Id`.
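For `Account`, for example, the auto-created mapping file `Account.sdl` would thus contain just a single line:

```
Id=Id
```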
Assumptions for the initial data load:
- Authentication as described above.
- We will have a set of .csv files named by object name: `Account.csv`, `Contact.csv`, `Lead.csv`, etc.
- Each .csv file follows these conventions:
  - An `ExternalId__c` custom field to be used in the upsert operation.
  - The .csv column names exactly match the field API names, so auto-creation of the mapping file will work.
  - Relationship lookups between records, e.g. Contact-to-Account assignments, use the proper field name syntax and refer to the `ExternalId__c` value of the respective target record.
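As an illustration, `Contact.csv` might look like the sketch below. The relationship column header `Account:ExternalId__c` is an assumption based on the Data Loader's lookup syntax for referencing a parent record by external ID; verify it against your own mapping conventions before use:

```
FirstName,LastName,ExternalId__c,Account:ExternalId__c
John,Example,CONT-0001,ACC-0001
Jane,Sample,CONT-0002,ACC-0001
```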
The corresponding script would look like this (short syntax and command aliases are used here for illustration):
```powershell
$MyEncryptedPassword = 'EncryptionResultFromPreviousStep'
$MyOrg = sfauth [email protected] $MyEncryptedPassword -KeyFile MyKeyfile.key -InstanceUrl https://test.salesforce.com
$MyObjectsToUpsertList = @(
    'Account',
    'Contact',
    'Lead',
    'Campaign'
)
foreach ($Object in $MyObjectsToUpsertList) {
    sfupsert $MyOrg $Object -ExternalIdField ExternalId__c -Bulk Serial
}
```
Notes on implicit behaviour involved here:
- We do not provide a path to a .csv file, so it always defaults to `<ObjectName>.csv` in the current directory.
- We do not provide an .sdl mapping file, so one is auto-created from the column headers of the .csv file. This results in a mapping file named `<ObjectName>.sdl` that maps all fields 1:1 as provided by the column headers.
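Putting both parts together, a complete reset run could be scripted as follows, reusing only the commands shown above. Note that the upsert order is the reverse of the delete order, so master records exist before their children are loaded:

```powershell
$MyEncryptedPassword = 'EncryptionResultFromPreviousStep'
$MyOrg = Get-SfAuthToken -Username [email protected] -EncryptedString $MyEncryptedPassword -KeyFile MyKeyfile.key -InstanceUrl https://test.salesforce.com

# Step 1: export a backup, then hard-delete, child objects before their masters
foreach ($Object in @('CampaignMember', 'Campaign', 'Lead', 'Contact', 'Account')) {
    Export-SfRecords $MyOrg $Object -Bulk Serial
    Remove-SfRecords $MyOrg $Object -Bulk SerialHardDelete
}

# Step 2: upsert fresh data, master objects before their children
# (sfupsert is the command alias used earlier on this page)
foreach ($Object in @('Account', 'Contact', 'Lead', 'Campaign')) {
    sfupsert $MyOrg $Object -ExternalIdField ExternalId__c -Bulk Serial
}
```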