Databricks dashboard builder:
lakehouse to live apps.

Connect camelAI to your Databricks lakehouse and turn notebooks, Delta tables, and ML models into live applications your whole team can use. No separate BI stack. No infrastructure.

Your notebooks deserve an audience.

The gap between a data scientist's notebook and a stakeholder-ready app is enormous. camelAI bridges it in one conversation.

analysis_notebook.py
[1]:
df = spark.sql("""
  SELECT date,
         SUM(revenue) AS total
  FROM catalog.sales.daily
  GROUP BY date
""")
Out [1]:
   date        total
0  2026-03-14  $48,291
1  2026-03-15  $52,104
2  2026-03-16  $45,887
... 362 more rows
camelAI
revenue-dashboard--acme.camelai.app

Total Revenue: $1.47M (+12.3%)
Avg Daily: $52.1K (+8.7%)
Best Day: $78.4K (Mar 8)

Daily Revenue (Last 14 Days)

Live -- auto-refreshes every hour · Published

Every layer of your lakehouse. One connection.

camelAI connects through your Databricks SQL warehouse and can query every layer of your Delta Lake medallion architecture.

Bronze

Raw ingested data

raw_events
raw_transactions
raw_logs

Silver

Cleaned & enriched

clean_events
user_sessions
enriched_txns

Gold

Business-ready aggregates

daily_revenue
user_segments
model_features

Delta Lake

Lakehouse Storage

Bronze, silver, and gold tables. Petabytes of structured and unstructured data in your lakehouse.

Databricks

Unified Analytics

Spark SQL, Unity Catalog, MLflow, and notebooks. Your compute and governance layer.

camelAI

AI Agent

Connects to Databricks via SQL warehouse or cluster. Writes Spark SQL, builds apps, deploys instantly.

Live Apps

Published at *.camelai.app

Dashboards, model monitors, data catalogs — live at a URL your whole team can use.
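As one hypothetical example, here is the kind of gold-layer query camelAI would generate against the architecture above. The table names come from the diagram; the catalog name and join column are illustrative, not a fixed schema.

```sql
-- Illustrative gold-layer query (catalog and join column are assumptions)
SELECT s.segment,
       SUM(r.revenue) AS total_revenue
FROM catalog.gold.daily_revenue AS r
JOIN catalog.gold.user_segments AS s
  ON r.user_id = s.user_id
GROUP BY s.segment
ORDER BY total_revenue DESC;
```

Because gold tables are already business-ready aggregates, queries like this stay small and fast even when the bronze layer underneath holds petabytes.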

Models in production need dashboards too.

Build ML observability dashboards from your MLflow data. Track accuracy, drift, and feature importance — share with stakeholders via a live URL.

churn_predictor_v3

XGBoost -- Production

MLflow Run #847

Prediction Drift: 0.012
Feature Drift: 0.087
Data Quality: 99.2%
Latency p99: 142ms

Model Accuracy (12 months): 0.928–0.951

Feature Importance

transaction_amount: 34%
user_tenure_days: 22%
session_duration: 18%
page_views_7d: 14%
email_open_rate: 8%
support_tickets: 4%
MLflow integration · Drift detection · Feature importance · Model versioning · A/B test tracking · Automated alerts
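To make the drift numbers above concrete: one common way to score prediction drift is the Population Stability Index (PSI), which compares a baseline score distribution to the current one. The sketch below is illustrative — PSI is a standard technique, not necessarily the exact metric camelAI computes.

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between two score samples.

    Rule of thumb: < 0.1 stable, 0.1–0.25 moderate drift, > 0.25 significant.
    Illustrative implementation -- not camelAI's internal metric.
    """
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0

    def frac(sample, i):
        left = lo + i * width
        # Last bin is closed on the right so the maximum value is counted.
        right = left + width if i < bins - 1 else hi + 1e-9
        count = sum(left <= x < right for x in sample)
        return max(count / len(sample), 1e-6)  # floor avoids log(0)

    return sum(
        (frac(actual, i) - frac(expected, i))
        * math.log(frac(actual, i) / frac(expected, i))
        for i in range(bins)
    )

baseline = [i / 100 for i in range(100)]
drifted = [min(x + 0.3, 0.99) for x in baseline]
print(psi(baseline, baseline) < 0.1, psi(baseline, drifted) > 0.25)  # True True
```

A monitor built on a score like this is what powers the "alerts when drift exceeds thresholds" behavior described below.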

Spark SQL. Natively.

[42]:
Connected
-- Unity Catalog + Delta Lake + window functions
WITH daily_metrics AS (
  SELECT
    date,
    model_version,
    AVG(prediction_accuracy) AS avg_accuracy,
    COUNT(*) AS predictions,
    LAG(AVG(prediction_accuracy), 1) OVER (
      PARTITION BY model_version
      ORDER BY date
    ) AS prev_accuracy
  FROM ml_catalog.production.predictions
  WHERE date >= '2026-01-01'
  GROUP BY date, model_version
)
SELECT *,
  ROUND(avg_accuracy - prev_accuracy, 4) AS accuracy_delta
FROM daily_metrics
WHERE avg_accuracy < prev_accuracy
ORDER BY accuracy_delta;
Out [42]: DataFrame
   date        model_version  avg_accuracy  predictions  accuracy_delta
   2026-02-18  v3.2           0.9284        14,291       -0.0068
   2026-03-01  v3.2           0.9312        15,447       -0.0041
   2026-03-09  v3.3           0.9451        16,102       -0.0023
3 rows with accuracy decline detected

camelAI writes Spark SQL natively — window functions, CTEs, Delta operations, Unity Catalog three-part names, and UDFs. If your SQL warehouse can run it, camelAI can write it.

Weeks of dashboarding, or one conversation.

Traditional BI on Databricks

  • BI tool license

    Tableau or Power BI — $70–150/user/month

  • Data modeling layer

    dbt project setup — 2–6 weeks of engineering

  • Dashboard development

    Weeks of iteration with the BI team

  • ML model visibility

    Custom monitoring — separate project entirely

Total time to first dashboard

4–12 weeks

+ ongoing BI tool costs

camelAI

  • One Databricks connection

    SQL warehouse endpoint + token — 2 minutes

  • One conversation

    Describe what you need — dashboards, monitors, catalogs

  • Published in minutes

    Live app at a shareable URL with auto-refresh

  • ML + analytics — same tool

    Model monitoring and business dashboards in one place

Total time to first dashboard

Minutes

Pay for what you use

Enterprise-ready. Lakehouse-native.

camelAI works within your existing Databricks governance and security model. No new attack surface.

Unity Catalog native

camelAI respects your Unity Catalog permissions. Row-level security, column masking, and data lineage — all honored automatically.

Private Link support

Connect via Azure Private Link or AWS PrivateLink. Your lakehouse traffic never touches the public internet.

Token-based auth

Authenticate with personal access tokens or service principals. Credentials are encrypted at rest and never logged.

Query audit trail

Every query camelAI runs is logged with identity, timestamp, warehouse, and compute cost. Full observability.

Team workspaces

Role-based access for your data team. Control who can query, build, publish, or manage connections.

Warehouse-aware

camelAI auto-scales with your SQL warehouse. Serverless, pro, or classic — it adapts to your compute configuration.
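To give the audit-trail claim above some shape: an audit record carrying identity, timestamp, warehouse, and compute cost might look like the sketch below. The field names and values are illustrative assumptions, not camelAI's actual log schema.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class QueryAuditRecord:
    # Illustrative shape only -- not camelAI's actual log schema.
    user: str        # identity that issued the query
    warehouse: str   # SQL warehouse the query ran on
    query: str       # the Spark SQL text that was executed
    timestamp: str   # UTC time the query was issued
    cost_usd: float  # estimated compute cost

record = QueryAuditRecord(
    user="analyst@acme.com",
    warehouse="sql-warehouse-prod",
    query="SELECT date, SUM(revenue) FROM catalog.sales.daily GROUP BY date",
    timestamp=datetime.now(timezone.utc).isoformat(),
    cost_usd=0.04,
)
print(json.dumps(asdict(record), indent=2))
```

Structured records like this are what make per-query observability queryable later — by user, by warehouse, or by cost.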

What will you build from your lakehouse?

Connect to our Databricks lakehouse and build an ML model monitoring dashboard. Pull metrics from MLflow — show accuracy, F1 score, and prediction drift over the last 30 days. Add alerts when drift exceeds thresholds.

Try this prompt

Query our Unity Catalog and build a data catalog browser. Show all schemas, tables, column descriptions, and data lineage. Make it searchable and publish it for the whole data team.

Try this prompt

Build a Spark job cost tracker from our Databricks SQL warehouse. Show compute costs by team, job duration trends, and flag any queries over $25. Set up a daily cron to email the summary.

Try this prompt

Create a feature store explorer that connects to our Databricks feature tables. Show feature distributions, freshness, and which models consume each feature. Interactive drill-downs.

Try this prompt

Works with your Databricks stack

Databricks SQL · Unity Catalog · Delta Lake · MLflow · Spark SQL · Delta Sharing · Azure Databricks · AWS Databricks · GCP Databricks · Databricks Connect

Your lakehouse. Live.

One conversation away.