Supermetrics for Databricks
Connect Salesforce to Databricks — CRM Intelligence on the Lakehouse


Load Salesforce opportunity, lead, and campaign data into Databricks for pipeline and revenue analytics.
Why Connect Salesforce to Databricks?
Warehouse your Salesforce data in Databricks for unlimited historical analysis and cross-source SQL.
ML deal close prediction with Feature Store
Compute features like deal velocity, touchpoint count, and stage duration from your Salesforce Delta table, register them in the Databricks Feature Store, and train a close prediction model with MLflow — same features for both training and real-time scoring.
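A feature table like the one described might be derived with a query along these lines — a sketch only; the catalog, table, and column names (`crm.salesforce_opportunities`, `crm.salesforce_activities`, `activity_id`, and so on) are illustrative assumptions, not names Supermetrics creates:

```sql
-- Sketch: derive close-prediction features from the Salesforce Delta tables.
-- All table and column names below are assumptions; adjust to your schema.
CREATE OR REPLACE TABLE crm.opportunity_features AS
SELECT
  o.opportunity_id,
  DATEDIFF(o.close_date, o.created_date) AS deal_age_days,    -- deal velocity proxy
  COUNT(a.activity_id)                   AS touchpoint_count  -- engagement volume
FROM crm.salesforce_opportunities o
LEFT JOIN crm.salesforce_activities a USING (opportunity_id)
GROUP BY o.opportunity_id, o.close_date, o.created_date;
```

Registering the resulting table in the Databricks Feature Store then makes the same feature definitions available for both MLflow training runs and real-time scoring.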
Delta Lake versioning for pipeline auditing
Every Salesforce data load creates a new Delta version. Use DESCRIBE HISTORY to see when deals changed stage, and VERSION AS OF to reconstruct last quarter's pipeline exactly as it was — a complete audit trail without manual snapshots.
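In practice that audit looks like this — the table name and version number are placeholders for your own:

```sql
-- List every load as a Delta version, with timestamp and operation.
DESCRIBE HISTORY crm.salesforce_opportunities;

-- Reconstruct the pipeline exactly as it stood at an earlier version
-- (version 42 is illustrative; pick one from the history output).
SELECT stage_name, SUM(amount) AS pipeline_value
FROM crm.salesforce_opportunities VERSION AS OF 42
GROUP BY stage_name;
```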
Unity Catalog governance for CRM data
Salesforce contains sensitive deal values, contact information, and pricing data. Unity Catalog lets you apply column masking, row filters per territory, and full data lineage from ingestion through to dashboards — enterprise governance built in.
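A minimal sketch of what that governance can look like in Unity Catalog — function names, group names, and columns here are illustrative assumptions:

```sql
-- Mask deal amounts for everyone outside an assumed 'finance' group.
CREATE OR REPLACE FUNCTION crm.mask_amount(amount DECIMAL(18,2))
RETURNS DECIMAL(18,2)
RETURN CASE WHEN is_account_group_member('finance') THEN amount ELSE NULL END;

ALTER TABLE crm.salesforce_opportunities
  ALTER COLUMN amount SET MASK crm.mask_amount;

-- Filter rows so reps see only their own territory (assumed group naming).
CREATE OR REPLACE FUNCTION crm.territory_filter(territory STRING)
RETURNS BOOLEAN
RETURN is_account_group_member(CONCAT('sales_', territory));

ALTER TABLE crm.salesforce_opportunities
  SET ROW FILTER crm.territory_filter ON (territory);
```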
How to Connect Salesforce to Databricks
Three steps. Under two minutes. Zero code.
1. Create a data transfer
Log into Supermetrics, select Salesforce as your source and Databricks as your destination.
2. Authorize and configure
Connect your Salesforce account, provide your Databricks workspace URL and access token, choose your catalog and schema, and select the data you want to transfer.
3. Set schedule and start transfer
Choose your refresh frequency (hourly, daily, or weekly) and click Start. Your data begins flowing into Databricks Delta tables automatically.
Salesforce Data Schema in Databricks
Supermetrics creates and maintains clean, typed Delta tables automatically, mapped from your Salesforce objects and fields.
Data Freshness & Scheduling
Salesforce data is typically available in Databricks within 3-6 hours of each scheduled sync.
What Salesforce Data Can You Pull into Databricks?
Supermetrics gives Databricks access to your full Salesforce reporting data — metrics and dimensions you already know from the Salesforce interface.
Key Metrics
- Opportunity count
- Amount won
- Hit rate
- Average amount per opportunity
- Age (days)
- Leads count
- Amount
- Expected revenue
- Probability
Key Dimensions
- Opportunity stage
- Account name
- Account industry
- Owner
- Lead source
- Product name
- Close date
- Campaign name
- Case type
- Case priority
Why Supermetrics for Databricks?
Purpose-built for marketing data since 2009. 200,000+ companies trust Supermetrics to move 15% of global ad spend into reporting and analytics destinations.
No Vendor Lock-In
Your data lands in Databricks — infrastructure you own and control. Use any BI tool, any transformation layer, any ML platform. If you ever switch providers, your data and dashboards stay with you.
170+ Marketing Data Sources
Purpose-built for marketing data — not a generic ETL tool. Supermetrics covers 99% of metrics and dimensions from each source, with pre-structured tables ready for analysis. No transformation layer required.
Incremental Loading
Only new and updated Salesforce records are transferred on each run — efficient, cost-effective, and fast. Full historical backfill available on demand.
Enterprise-Grade Security
SOC 2 Type II certified. GDPR and CCPA compliant. OAuth authentication with encrypted credentials. Regional data hosting available. Your data is protected end-to-end.
Flat-Rate, Predictable Pricing
Fixed annual pricing regardless of data volume — no per-row charges, no surprise bills during peak campaign seasons. Transfer as much Salesforce data as you need without worrying about cost spikes.
Complete Data Access
Pull every contact, deal, campaign, and custom property from Salesforce. No field restrictions, no record limits — your complete dataset, ready for analysis.
Frequently Asked Questions
How do I connect Salesforce to Databricks with Supermetrics?
Log into the Supermetrics Hub, create a new data transfer, select Salesforce as the source and Databricks as the destination. Authorize your Salesforce account, provide your Databricks workspace URL and access token, choose your catalog, schema, and Unity Catalog settings, select the fields you need, set a schedule, and start the transfer. No custom notebooks, Spark jobs, or Delta Lake plumbing required — Supermetrics writes directly to Delta tables and registers them in Unity Catalog so your data is governed, versioned, and queryable with both SQL and PySpark from the moment it lands.
Is my Salesforce data secure when transferring to Databricks?
Supermetrics is SOC 2 Type II certified and fully GDPR compliant. All Salesforce credentials are encrypted at rest and in transit. Data flows directly from the Salesforce API into your Databricks workspace — Supermetrics never stores your marketing data on its own servers. Unity Catalog provides centralized governance: fine-grained row-level and column-level security, attribute-based access control, and a full audit log of who queried what. Delta Lake's transaction log makes every write atomic and traceable, so you always have a verifiable lineage of your Salesforce data from ingestion to insight.
Can I combine Salesforce data with other sources in Databricks?
That is one of the defining advantages of the Databricks lakehouse architecture. Once Salesforce data lands as a Delta table, you can JOIN it with any other table in your lakehouse — raw event streams, CRM exports, product analytics, even ML Feature Store tables used for model training. Query in SQL from Databricks SQL warehouses or switch to PySpark and pandas for data science workflows — same data, no copying. Supermetrics supports 170+ connectors that all land in the same Unity Catalog namespace, and the Photon engine accelerates analytical queries on those Delta tables automatically.
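A cross-source join of this kind might look as follows — the ad-spend table (`ads.google_ads_campaigns`) and its columns are assumed examples of data landed by another connector:

```sql
-- Blend Salesforce revenue with assumed ad-spend data in the same
-- Unity Catalog namespace to compute return on ad spend per campaign.
SELECT
  o.campaign_name,
  SUM(o.amount)               AS won_revenue,
  SUM(s.cost)                 AS ad_spend,
  SUM(o.amount) / SUM(s.cost) AS return_on_ad_spend
FROM crm.salesforce_opportunities o
JOIN ads.google_ads_campaigns s
  ON o.campaign_name = s.campaign_name
WHERE o.stage_name = 'Closed Won'
GROUP BY o.campaign_name;
```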
What Salesforce metrics and dimensions are available in Databricks?
All standard Salesforce reporting fields are available, including Opportunity count, Amount won, Hit rate, Average amount per opportunity, Age (days), Leads count, and many more. You select exactly which metrics and dimensions to transfer during setup, and you can add or remove fields at any time without losing historical data already stored in your Delta tables. Delta Lake's time travel lets you query any previous version of your Salesforce data — useful for auditing retroactive metric recalculations or reproducing a dashboard state from last quarter. Schema evolution is handled automatically, so new fields appear as columns without breaking existing queries.
How fresh is Salesforce data in Databricks?
Data freshness depends on your transfer schedule. Supermetrics supports hourly, daily, or weekly transfers into Databricks. Most teams schedule daily transfers so yesterday's complete data is available each morning. Delta Lake's MERGE capability ensures only new and changed records are upserted, keeping cluster utilization and storage costs low. For teams that need near-real-time visibility, the Photon engine accelerates incremental queries so dashboards refresh in seconds, and you can set up Databricks SQL alerts to trigger notifications when key Salesforce metrics cross your thresholds.
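The incremental upsert described above has roughly this shape in Delta Lake — illustrative only, since Supermetrics manages the load for you and the staging table name is an assumption:

```sql
-- Upsert only new and changed Salesforce records into the Delta table.
MERGE INTO crm.salesforce_opportunities AS t
USING staging.salesforce_opportunities_delta AS s
  ON t.opportunity_id = s.opportunity_id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;
```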
Which Salesforce objects are supported?
The connector supports opportunities, leads, accounts, contacts, campaigns, cases, and custom objects. Standard and custom fields are available.
Can I track pipeline changes over time in Databricks?
Yes. Incremental loads capture deal stage changes, so you can build pipeline progression and stage duration analyses with SQL.
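A stage-duration analysis over those captured changes might be sketched like this — it assumes a history table (`crm.salesforce_opportunity_history`) where each load stamps rows with a `load_timestamp`, which is an assumption about your setup:

```sql
-- How long do deals sit in each stage, on average?
WITH stage_changes AS (
  SELECT
    opportunity_id,
    stage_name,
    load_timestamp,
    LEAD(load_timestamp) OVER (
      PARTITION BY opportunity_id ORDER BY load_timestamp
    ) AS next_change
  FROM crm.salesforce_opportunity_history
)
SELECT stage_name,
       AVG(DATEDIFF(next_change, load_timestamp)) AS avg_days_in_stage
FROM stage_changes
GROUP BY stage_name;
```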
Ready to Connect Salesforce to Databricks?
Join 200,000+ companies that use Supermetrics to connect their marketing data. Set up in under two minutes.


