Build generative AI–powered Salesforce applications with Amazon Bedrock


This post is co-authored by Daryl Martis and Darvish Shadravan from Salesforce.

This is the fourth post in a series discussing the integration of Salesforce Data Cloud and Amazon SageMaker.

In Part 1 and Part 2, we show how the integration of Salesforce Data Cloud and Einstein Studio with SageMaker allows businesses to securely access their Salesforce data and use SageMaker’s tools to build, train, and deploy models to endpoints hosted on SageMaker. SageMaker endpoints can be registered with Salesforce Data Cloud to activate predictions in Salesforce. In Part 3, we demonstrate how business analysts and citizen data scientists can create machine learning (ML) models, without code, in Amazon SageMaker Canvas and deploy trained models for integration with Salesforce Einstein Studio to create powerful business applications.

In this post, we show how native integrations between Salesforce and Amazon Web Services (AWS) enable you to Bring Your Own Large Language Models (BYO LLMs) from your AWS account to power generative artificial intelligence (AI) applications in Salesforce. Requests and responses between Salesforce and Amazon Bedrock pass through the Einstein Trust Layer, which promotes responsible AI use across Salesforce.

We demonstrate BYO LLM integration by using Anthropic’s Claude model on Amazon Bedrock to summarize a list of open service cases and opportunities on an account record page, as shown in the following figure.

Partner quote

“We continue to expand on our strong collaboration with AWS with our BYO LLM integration with Amazon Bedrock, empowering our customers with more model choices and allowing them to create AI-powered features and Copilots customized for their specific business needs. Our open and flexible AI environment, grounded with customer data, positions us well to be leaders in AI-driven solutions in the CRM space.”

–Kaushal Kurapati, Senior Vice President of Product for AI at Salesforce

Amazon Bedrock

Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI. Using Amazon Bedrock, you can quickly experiment with and evaluate top FMs for your use case, privately customize them with your data using techniques such as fine-tuning and Retrieval Augmented Generation (RAG), and build agents that execute tasks using your enterprise systems and data sources. Since Amazon Bedrock is serverless, you don’t have to manage infrastructure, and you can securely integrate and deploy generative AI capabilities into your applications using the AWS services you are already familiar with.
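
For instance, you can explore the models that the Amazon Bedrock API exposes with a few lines of the AWS SDK for Python (Boto3). The following is a minimal sketch, not part of the original walkthrough; the Region and the Anthropic provider filter are illustrative assumptions.

```python
# Minimal sketch: list Anthropic foundation models available through Amazon Bedrock.
# Assumes AWS credentials are already configured; the Region is an example only.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

response = bedrock.list_foundation_models(byProvider="Anthropic")
for model in response["modelSummaries"]:
    print(model["modelId"], "-", model["modelName"])
```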

Salesforce Data Cloud and Einstein Model Builder

Salesforce Data Cloud is a data platform that unifies your company’s data, giving every team a 360-degree view of the customer to drive automation and analytics, personalize engagement, and power trusted AI. Data Cloud creates a holistic customer view by turning volumes of disconnected data into a single, trusted model that’s simple to access and understand. With data harmonized within Salesforce Data Cloud, customers can put their data to work to build predictions and generative AI–powered business processes across sales, support, and marketing.

With Einstein Model Builder, customers can build their own models using Salesforce’s low-code model builder experience or integrate their own custom-built models into the Salesforce platform. Einstein Model Builder’s BYO LLM experience provides the capability to register custom generative AI models from external environments such as Amazon Bedrock into Salesforce Data Cloud.

Once custom Amazon Bedrock models are registered in Einstein Model Builder, they are connected through the Einstein Trust Layer, a robust set of features and guardrails that protect the privacy and security of data, improve the safety and accuracy of AI results, and promote the responsible use of AI across Salesforce. Registered models can then be used in Prompt Builder, a newly launched, low-code prompt engineering tool that allows Salesforce admins to build, test, and fine-tune trusted AI prompts that can be used across the Salesforce platform. These prompts can be integrated with Salesforce capabilities such as Flows, Invocable Actions, and Apex.

Solution overview

With the Salesforce Einstein Model Builder BYO LLM feature, you can invoke Amazon Bedrock models in your AWS account. At the time of this writing, Salesforce supports Anthropic Claude 3 models on Amazon Bedrock for BYO LLM. For this post, we use the Anthropic Claude 3 Sonnet model. To learn more about inference with Claude 3, refer to Anthropic Claude models in the Amazon Bedrock documentation.
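
To see what such an invocation looks like from your AWS account, the following is a minimal Boto3 sketch of calling Claude 3 Sonnet through the Bedrock Messages API. The Region and prompt text are illustrative assumptions; once the model is registered, Einstein Studio makes a comparable call on your behalf.

```python
# Minimal sketch: invoke Anthropic's Claude 3 Sonnet on Amazon Bedrock.
# The Region and prompt are illustrative; adjust them for your environment.
import json
import boto3

bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "messages": [
        {"role": "user", "content": "In two sentences, explain the value of pairing CRM data with an LLM."}
    ],
}

response = bedrock_runtime.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps(body),
)

result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```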

For your implementation, you may use the model of your choice. Refer to Bring Your Own Large Language Model in Einstein 1 Studio for models supported with Salesforce Einstein Model Builder.

The following image shows a high-level architecture of how you can integrate the LLM from your AWS account into the Salesforce Prompt Builder.

In this post, we show how to build generative AI–powered Salesforce applications with Amazon Bedrock. The following are the high-level steps involved:

  1. Grant Amazon Bedrock invoke model permission to an AWS Identity and Access Management (IAM) user
  2. Register the Amazon Bedrock model in Salesforce Einstein Model Builder
  3. Integrate the prompt template with the field in the Lightning App Builder

Prerequisites

Before deploying this solution, make sure you meet the following prerequisites:

  1. Have access to Salesforce Data Cloud and meet the requirements for using BYO LLM.
  2. Have Amazon Bedrock set up. If this is the first time you are accessing Anthropic Claude models on Amazon Bedrock, you need to request access, which requires sufficient permissions to manage model access through the console. To request model access, sign in to the Amazon Bedrock console and select Model access at the bottom of the left navigation pane.

Solution walkthrough

To build generative AI–powered Salesforce applications with Amazon Bedrock, implement the following steps.

Grant Amazon Bedrock invoke model permission to an IAM user

Salesforce Einstein Studio requires an access key and a secret access key to access the Amazon Bedrock API. Follow the instructions to set up an IAM user and access keys. The IAM user must have Amazon Bedrock invoke model permission to access the model. Complete the following steps; a scripted alternative using the AWS SDK for Python (Boto3) follows the list:

  1. On the IAM console, choose Users in the navigation pane and select the user that Einstein Studio will use. Choose Add permissions, then Create inline policy.
  2. On the Specify permissions screen, in the Service dropdown menu, select Bedrock.
  3. Under Actions allowed, enter invoke in the search box. Under Read, select InvokeModel. Under Resources, select All. Choose Next.
  4. On the Review and create screen, under Policy name, enter BedrockInvokeModelPolicy. Choose Create policy.
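
If you prefer to script this setup, the following Boto3 sketch creates an equivalent inline policy and an access key for an existing IAM user. The user name is a hypothetical placeholder, and the policy mirrors the console steps above.

```python
# Sketch of a scripted alternative to the console steps above.
# "einstein-studio-user" is a hypothetical IAM user name; substitute your own.
import json
import boto3

iam = boto3.client("iam")
user_name = "einstein-studio-user"  # hypothetical user that Einstein Studio will use

# Inline policy equivalent to the BedrockInvokeModelPolicy created in the console
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": "bedrock:InvokeModel", "Resource": "*"}
    ],
}
iam.put_user_policy(
    UserName=user_name,
    PolicyName="BedrockInvokeModelPolicy",
    PolicyDocument=json.dumps(policy_document),
)

# Create the access key and secret that you will enter in Einstein Model Builder.
# The secret is returned only once, so store it securely.
access_key = iam.create_access_key(UserName=user_name)["AccessKey"]
print("Access key ID:", access_key["AccessKeyId"])
```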

Register the Amazon Bedrock model in Einstein Model Builder

  1. On the Salesforce Data Cloud console, under the Einstein Studio tab, choose Add Foundation Model.
  2. Choose Connect to Amazon Bedrock.
  3. For Endpoint information, enter the endpoint name, the access key and secret key for the IAM user you created, and the Region and model information. Choose Connect.
  4. Now, create the configuration for the model endpoint you created in the previous steps. Provide inference parameters such as temperature to control how deterministic the LLM’s output is. Enter a sample prompt to verify the response. (A sketch for testing the same model and parameters directly against Amazon Bedrock follows these steps.)
  5. Next, save this new model configuration. Enter a name for the saved LLM model and choose Create Model.
  6. After the model creation is successful, choose Close and proceed to create the prompt template.
  7. Select the Model name to open the Model configuration.
  8. Select Create Prompt Template to launch the prompt builder.
  9. Select Field Generation as the prompt template type, enter a template name, set Object to Account, and set Object Field to PB Case and Oppty Summary. This associates the template with a custom field on the account record object that will hold the case summary.

For this demo, a rich text field named PB Case and Oppty Summary was created and added to the Salesforce Account page layout according to the Add a Field Generation Prompt Template to a Lightning Record Page instructions.

  10. Provide the prompt and input variables or objects for data grounding, and select the model. Refer to Prompt Builder to learn more.
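
Before wiring the template into a page, you can optionally verify the same model, a temperature setting, and a sample summarization prompt directly against Amazon Bedrock. The following Boto3 sketch is illustrative only; the sample case text, temperature value, and Region are assumptions rather than values from the walkthrough.

```python
# Sketch: test the model's behavior (temperature + sample prompt) outside Salesforce.
# The case text, temperature, and Region are illustrative assumptions.
import json
import boto3

bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

sample_cases = (
    "Case 001: Customer reports login failures after a password reset.\n"
    "Case 002: Billing discrepancy on the latest invoice.\n"
    "Case 003: Feature request for exporting dashboards to PDF."
)

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 400,
    "temperature": 0.2,  # a lower temperature keeps the summary more deterministic
    "messages": [
        {
            "role": "user",
            "content": f"Summarize the following open service cases for an account:\n{sample_cases}",
        }
    ],
}

response = bedrock_runtime.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps(body),
)
print(json.loads(response["body"].read())["content"][0]["text"])
```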

Integrate the prompt template with the field in the Lightning App Builder

  1. On the Salesforce console, use the search bar to find Lightning App Builder. Create a new page or edit an existing one to integrate the prompt template with the field, as shown in the following screenshot. Refer to Add a Field Generation Prompt Template to a Lightning Record Page for detailed instructions.
  2. Navigate to the Account page and choose the PB Case and Oppty Summary field (enabled for chat completion) to launch the Einstein generative AI assistant and summarize the account’s case data.

Clean up

Complete the following steps to clean up your resources. A scripted sketch of the AWS-side cleanup follows the list.

  1. Delete the IAM user
  2. Delete the foundation model in Einstein Studio
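
If you created the IAM user and inline policy with the earlier scripted sketch, the AWS-side cleanup can be scripted the same way. The user and policy names below are the same hypothetical values used earlier; deleting the foundation model itself happens in Einstein Studio.

```python
# Sketch: AWS-side cleanup for the hypothetical IAM user created earlier.
import boto3

iam = boto3.client("iam")
user_name = "einstein-studio-user"  # hypothetical user from the earlier sketch

# Remove the inline policy and all access keys before deleting the user
iam.delete_user_policy(UserName=user_name, PolicyName="BedrockInvokeModelPolicy")
for key in iam.list_access_keys(UserName=user_name)["AccessKeyMetadata"]:
    iam.delete_access_key(UserName=user_name, AccessKeyId=key["AccessKeyId"])
iam.delete_user(UserName=user_name)
```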

Amazon Bedrock offers on-demand inference pricing, so there are no additional costs from a continued model subscription. To remove model access, refer to the steps in Remove model access.

Conclusion

In this post, we demonstrated how to use your own LLM in Amazon Bedrock to power Salesforce applications. We used summarization of open service cases on an account object as an example to showcase the implementation steps.

Amazon Bedrock is a fully managed service that makes high-performing FMs from leading AI companies and Amazon available for your use through a unified API. You can choose from a wide range of FMs to find the model that is best suited for your use case.

Salesforce Einstein Model Builder lets you register your Amazon Bedrock model and use it in Prompt Builder to create prompts grounded in your data. These prompts can then be integrated with Salesforce capabilities such as Flows, Invocable Actions, and Apex. You can then build custom generative AI applications with Claude 3 that are grounded in the Salesforce user experience. Amazon Bedrock requests from Salesforce pass through the Einstein Trust Layer, which promotes responsible AI use with features such as dynamic grounding, zero data retention, and toxicity detection while maintaining safety and security standards.

AWS and Salesforce are excited for our mutual customers to harness this integration and build generative AI–powered applications. To learn more and start building, refer to the following resources.


About the Authors

Daryl Martis is the Director of Product for Einstein Studio at Salesforce Data Cloud. He has over 10 years of experience in planning, building, launching, and managing world-class solutions for enterprise customers, including AI/ML and cloud solutions. He has previously worked in the financial services industry in New York City. Follow him on LinkedIn.

Darvish Shadravan is a Director of Product Management in the AI Cloud at Salesforce. He focuses on building AI/ML features for CRM, and is the product owner for the Bring Your Own LLM feature. You can connect with him on LinkedIn.

Rachna Chadha is a Principal Solutions Architect, AI/ML, in Strategic Accounts at AWS. Rachna is an optimist who believes that ethical and responsible use of AI can improve society in the future and bring economic and social prosperity. In her spare time, Rachna likes spending time with her family, hiking, and listening to music.

Ravi Bhattiprolu is a Sr. Partner Solutions Architect at AWS. Ravi works with strategic partners Salesforce and Tableau to deliver innovative and well-architected products and solutions that help joint customers realize their business objectives.

Ife Stewart is a Principal Solutions Architect in the Strategic ISV segment at AWS. She has been engaged with Salesforce Data Cloud over the last 2 years to help build integrated customer experiences across Salesforce and AWS. Ife has over 10 years of experience in technology. She is an advocate for diversity and inclusion in the technology field.

Mike Patterson is a Senior Customer Solutions Manager in the Strategic ISV segment at AWS. He has partnered with Salesforce Data Cloud to align business objectives with innovative AWS solutions to achieve impactful customer experiences. In Mike’s spare time, he enjoys spending time with his family, sports, and outdoor activities.

Dharmendra Kumar Rai (DK Rai) is a Sr. Data Architect, Data Lake & AI/ML, serving strategic customers. He works closely with customers to understand how AWS can help them solve problems, especially in the AI/ML and analytics space. DK has many years of experience in building data-intensive solutions across a range of industry verticals, including high-tech, FinTech, insurance, and consumer-facing applications.
