Salesforce Big Objects Guide

Himanshu Varshney
Senior Salesforce Developer
January 26, 2024


Big Objects Best Practices

When working with Big Objects, following a few best practices helps ensure performance and scalability:

Define Clear Use Cases: Identify specific use cases for Big Objects, focusing on large-scale data storage needs that don't require real-time access.

Data Archiving: Use Big Objects for archiving historical data that needs to be accessible but not frequently modified.

Performance Considerations: Design queries on Big Objects to be specific and narrow to minimize performance impacts, using indexed fields whenever possible.

Data Management: Regularly review and clean up unnecessary data to maintain system performance and reduce storage costs.

Integration Strategies: When integrating external systems, consider asynchronous data processing to reduce the load on Salesforce.


Define and Deploy Custom Big Objects

To create a Custom Big Object, you define its structure through the Metadata API by specifying its fields and its index. The process involves creating a metadata file that describes the Big Object, including its API name (which ends in __b), its fields, and its index definition, and deploying that file to Salesforce with a tool such as the Salesforce CLI or Workbench.
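
As a sketch, the object file for a hypothetical CustomerInteraction__b Big Object could look like the following; the object, field, and index names are illustrative, not part of any standard schema:

<?xml version="1.0" encoding="UTF-8"?>
<CustomObject xmlns="http://soap.sforce.com/2006/04/metadata">
    <deploymentStatus>Deployed</deploymentStatus>
    <fields>
        <fullName>Account__c</fullName>
        <label>Account</label>
        <type>Text</type>
        <length>18</length>
        <required>true</required>
    </fields>
    <fields>
        <fullName>Interaction_Date__c</fullName>
        <label>Interaction Date</label>
        <type>DateTime</type>
        <required>true</required>
    </fields>
    <indexes>
        <fullName>CustomerInteractionIndex</fullName>
        <label>Customer Interaction Index</label>
        <fields>
            <name>Account__c</name>
            <sortDirection>DESC</sortDirection>
        </fields>
        <fields>
            <name>Interaction_Date__c</name>
            <sortDirection>DESC</sortDirection>
        </fields>
    </indexes>
    <label>Customer Interaction</label>
    <pluralLabel>Customer Interactions</pluralLabel>
</CustomObject>

Note that every field included in the index must be marked as required.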


Deploying and Retrieving Metadata with the Zip File

To deploy or retrieve Big Object metadata, you can use the Metadata API's deploy and retrieve operations with a ZIP file. This ZIP file contains the metadata descriptions of the Big Objects, including field definitions and index configurations. The Salesforce CLI or Workbench can be used to perform these operations.
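
For a single Big Object, the ZIP contents can be as simple as a package.xml plus the object file, for example:

unpackaged/
    package.xml
    objects/
        CustomerInteraction__b.object

A minimal package.xml for the hypothetical object from the previous section might look like this:

<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>CustomerInteraction__b</members>
        <name>CustomObject</name>
    </types>
    <version>59.0</version>
</Package>

With the older sfdx-style commands (exact syntax varies by CLI version), one way to deploy the archive and retrieve the metadata back is:

sfdx force:mdapi:deploy -f bigobject.zip -w 10
sfdx force:mdapi:retrieve -k package.xml -r ./retrieved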


Populate a Custom Big Object

Populating a Custom Big Object can be done through various methods:

Batch Apex: Suitable for large-scale data migrations or integrations (see the sketch after this list).

External Data Integration Tools: Tools like Salesforce Data Loader or external ETL tools can be used for data import.

APIs: Salesforce APIs (e.g., Bulk API) can insert large volumes of data efficiently.
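
As an illustrative sketch of the Batch Apex route, the class below archives old tasks into a hypothetical Task_Archive__b Big Object; the object and field names are assumptions, not a standard schema:

public class TaskArchiveBatch implements Database.Batchable<SObject> {
    public Database.QueryLocator start(Database.BatchableContext bc) {
        // Archive candidates: closed tasks older than two years
        return Database.getQueryLocator(
            'SELECT Id, Subject, CreatedDate FROM Task ' +
            'WHERE IsClosed = true AND CreatedDate < LAST_N_YEARS:2');
    }
    public void execute(Database.BatchableContext bc, List<Task> scope) {
        List<Task_Archive__b> archives = new List<Task_Archive__b>();
        for (Task t : scope) {
            archives.add(new Task_Archive__b(
                Task_Id__c = t.Id,
                Subject__c = t.Subject,
                Created_Date__c = t.CreatedDate));
        }
        // insertImmediate runs outside the transaction and can't be rolled back
        Database.insertImmediate(archives);
    }
    public void finish(Database.BatchableContext bc) {}
}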


Populate a Custom Big Object with Apex

You can use Apex to insert data into a Big Object programmatically with the Database.insertImmediate() method. This is particularly useful for complex data processing or when the data originates within Salesforce. As with any Apex, the code must be bulkified and optimized to handle large data volumes.
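
A minimal sketch, again using the hypothetical CustomerInteraction__b object (the field names are assumptions):

List<CustomerInteraction__b> interactions = new List<CustomerInteraction__b>();
interactions.add(new CustomerInteraction__b(
    Account__c = '001xx000003DGbYAAW', // illustrative 18-character record Id
    Interaction_Date__c = System.now()));
// insertImmediate writes directly to the Big Object; it runs outside the
// surrounding transaction and can't be rolled back
Database.insertImmediate(interactions);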


Delete Data in a Custom Big Object

Deleting data from Big Objects is not as straightforward as with standard or custom objects. You must use the Database.deleteImmediate() method, which removes records identified by the values of their index fields. Because the operation executes immediately and can't be rolled back, it must be managed carefully, both for data-loss risk and for its potential impact on system performance.
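
A minimal sketch, again assuming the hypothetical CustomerInteraction__b object:

// Query the rows to remove, filtering on the index fields, then pass
// the results to deleteImmediate
List<CustomerInteraction__b> stale = [
    SELECT Account__c, Interaction_Date__c
    FROM CustomerInteraction__b
    WHERE Account__c = '001xx000003DGbYAAW'
];
Database.deleteImmediate(stale);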


Big Objects Queueable Example

Queueable Apex can be used to perform asynchronous operations on Big Objects, such as inserting or processing large volumes of data. Asynchronous execution helps manage resource consumption and keeps long-running work out of the user transaction. The sketch below inserts records into the hypothetical CustomerInteraction__b object used in the earlier examples:

public class BigObjectQueueable implements Queueable {
    private List<CustomerInteraction__b> records; // hypothetical Big Object records
    public BigObjectQueueable(List<CustomerInteraction__b> records) {
        this.records = records;
    }
    public void execute(QueueableContext context) {
        // Runs asynchronously; insertImmediate isn't rolled back on failure
        Database.insertImmediate(records);
    }
}
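
The job is then enqueued like any other Queueable, for example with System.enqueueJob(new BigObjectQueueable(records));.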


Big Object Query Examples

Querying Big Objects uses SOQL, but with tighter restrictions than standard SOQL. In particular, you can filter only on the object's index fields, and those filters must follow the order in which the fields appear in the index.

SELECT Field1__c, Field2__c FROM CustomBigObject__b WHERE IndexedField__c = 'Value'
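
With a composite index, filters must follow the index order, and a range condition is allowed only on the last field referenced. For example, against the hypothetical CustomerInteraction__b object defined earlier:

SELECT Account__c, Interaction_Date__c
FROM CustomerInteraction__b
WHERE Account__c = '001xx000003DGbYAAW'
AND Interaction_Date__c > 2023-01-01T00:00:00Z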


View Big Object Data in Reports and Dashboards

Currently, directly accessing Big Object data in standard Salesforce reports and dashboards is not supported. To visualize this data, you may need to use custom solutions, such as Lightning Components or external reporting tools that can access Big Object data through APIs.


SOQL with Big Objects

SOQL queries on Big Objects have specific considerations:

Indexed Fields: The WHERE clause can filter only on the index fields, referenced in the order they're defined and without gaps, starting from the first; a range condition is allowed only on the last field in the filter.

Limitations: ORDER BY and GROUP BY clauses aren't supported, and neither are the LIKE, !=, NOT IN, INCLUDES, and EXCLUDES operators.

Aggregate Queries: Aggregate functions such as COUNT() aren't supported against Big Objects, so summaries must be computed in Apex or in an external tool after the rows are retrieved.

By understanding and utilizing these aspects of Big Objects, organizations can effectively manage large volumes of data within Salesforce, ensuring data accessibility without compromising system performance.
