R Toolkit for 'Databricks'
Access Control Request for Group
Access Control Request for User
Access Control Request
Add Library Path
AWS Attributes
Azure Attributes
Close Databricks Workspace Connection
Cluster Autoscale
Cluster Log Configuration
Condition Task
Copy data frame to Databricks as table or view
Cron Schedule
DBI Interface for Databricks SQL Warehouses
dbplyr Backend for Databricks SQL
DBI Connection for Databricks
DBI Driver for Databricks
DBI Result for Databricks
Create Databricks SQL Driver
Append data using atomic INSERT INTO with SELECT VALUES
Assert that a statement is provided
Assert that a connection is valid
Clean table name input
Cluster Action Helper Function
Create a Cluster
Delete/Terminate a Cluster
Edit a Cluster
List Cluster Activity Events
Get Details of a Cluster
List Available Cluster Node Types
List Availability Zones (AWS Only)
List Clusters
Permanently Delete a Cluster
Pin a Cluster
Resize a Cluster
Restart a Cluster
List Available Databricks Runtime Versions
Start a Cluster
Delete/Terminate a Cluster
Unpin a Cluster
Collect query results with proper progress timing for Databricks
Cancel a Command
Parse Command Results
Run a Command and Wait For Results
Run a Command
Get Information About a Command
Create an Execution Context
Delete an Execution Context
Databricks Execution Context Manager (R6 Class)
Get Information About an Execution Context
Create table with explicit schema before inserting values
Create table from data frame structure
Detect Current Workspace Cloud
Get Current User Info
Detect Current Workspace ID
DBFS Add Block
DBFS Close
DBFS Create
DBFS Delete
DBFS Get Status
DBFS List
DBFS mkdirs
DBFS Move
DBFS Put
DBFS Read
Escape string literals for inline SQL VALUES
Generate typed VALUES SQL for temporary views (helper)
Generate type-aware VALUES SQL from data frame
Generate VALUES SQL from data frame
Generate/Fetch Databricks Host
Create Job
Delete a Job
Get Job Details
List Jobs
Repair A Job Run
Overwrite All Settings For A Job
Trigger A New Job Run
Cancel Job Run
Delete Job Run
Export Job Run Output
Get Job Run Output
Get Job Run Details
List Job Runs
Create And Trigger A One-Time Run
Partially Update A Job
Generate Database Credential
Find Database Instance by UID
Get Database Instance
List Database Instances
Get Status of All Libraries on All Clusters
Get Status of Libraries on Cluster
Install Library on Cluster
Uninstall Library on Cluster
Approve Model Version Stage Transition Request
Delete a Model Version Stage Transition Request
Get All Open Stage Transition Requests for the Model Version
Reject Model Version Stage Transition Request
Make a Model Version Stage Transition Request
Transition a Model Version's Stage
Delete a Comment on a Model Version
Edit a Comment on a Model Version
Make a Comment on a Model Version
Get Registered Model Details
Create OAuth 2.0 Client
Perform Databricks API Request
Prepare fields for CREATE TABLE
Create a SQL Query
Delete a SQL Query
Get a SQL Query
List SQL Queries
Update a SQL Query
Read .netrc File
Remote REPL to Databricks Cluster
Create Repo
Delete Repo
Get All Repos
Get Repo
Update Repo
Propagate Databricks API Errors
Generate Request JSON
Databricks Request Helper
Delete Secret in Secret Scope
List Secrets in Secret Scope
Put Secret in Secret Scope
Delete Secret Scope ACL
Get Secret Scope ACL
List Secret Scope ACLs
Put ACL on Secret Scope
Create Secret Scope
Delete Secret Scope
List Secret Scopes
Check if volume method should be used
Create Empty Data Frame from Query Manifest
Execute SQL Query and Wait for Completion
Cancel SQL Query
Poll a Query Until Successful
Execute SQL Query
Get SQL Query Results
Get SQL Query Status
Fetch SQL Query Results (Fast Path)
Fetch SQL Query Results (Parallel Path)
Fetch SQL Query Results from Completed Query
Get Global Warehouse Config
Process Inline SQL Query Results
List Warehouse Query History
Execute query with SQL Warehouse
Create Empty R Vector from Databricks SQL Type
Create Warehouse
Delete Warehouse
Edit Warehouse
Get Warehouse
List Warehouses
Start Warehouse
Stop Warehouse
Fetch Databricks Token
Get Catalog (Unity Catalog)
List Catalogs (Unity Catalog)
Get Schema (Unity Catalog)
List Schemas (Unity Catalog)
Delete Table (Unity Catalog)
Check Table Exists (Unity Catalog)
Get Table (Unity Catalog)
List Tables (Unity Catalog)
List Table Summaries (Unity Catalog)
Create Volume (Unity Catalog)
Delete Volume (Unity Catalog)
Get Volume (Unity Catalog)
List Volumes (Unity Catalog)
Update Volume (Unity Catalog)
Volume FileSystem Delete
Volume FileSystem Create Directory
Volume FileSystem Delete Directory
Volume FileSystem Check Directory Exists
Volume FileSystem File Status
Volume FileSystem List Directory Contents
Volume FileSystem Read
Recursively delete all contents of a volume directory
Upload Directory to Volume in Parallel
Volume FileSystem Write
Create a Vector Search Endpoint
Delete a Vector Search Endpoint
Get a Vector Search Endpoint
List Vector Search Endpoints
Create a Vector Search Index
Delete Data from a Vector Search Index
Delete a Vector Search Index
Get a Vector Search Index
List Vector Search Indexes
Query Vector Search Next Page
Query a Vector Search Index
Scan a Vector Search Index
Synchronize a Vector Search Index
Upsert Data into a Vector Search Index
Delete Object/Directory (Workspaces)
Export Notebook or Directory (Workspaces)
Get Object Status (Workspaces)
Import Notebook/Directory (Workspaces)
List Directory Contents (Workspaces)
Make a Directory (Workspaces)
Write table using standard SQL approach
Write table using volume-based approach
Fetch Databricks Workspace ID
Append rows to an existing Databricks table
Append rows to an existing Databricks table (Id method)
Begin transaction (not supported)
Clear result set
Get column information from result
Commit transaction (not supported)
Connect to Databricks SQL Warehouse
Create an empty Databricks table (AsIs method)
Create an empty Databricks table
Create an empty Databricks table (Id method)
Map R data types to Databricks SQL types
Disconnect from Databricks
Execute statement on Databricks
Check if table exists (AsIs method)
Check if table exists in Databricks
Check if table exists (Id method)
Fetch results from Databricks query
DBFS Storage Information
Get connection information
Execute SQL query and return results
Get number of rows fetched
Get number of rows affected (not applicable for SELECT)
Get SQL statement from result
Check if query has completed
Check if connection is valid
List column names of a Databricks table (AsIs method)
List column names of a Databricks table
List tables in Databricks catalog/schema
Declare dbplyr API version for Databricks connections
Quote identifiers for Databricks SQL
Quote complex identifiers (schema.table)
Quote SQL objects (passthrough)
Read a Databricks table (AsIs method)
Read a Databricks table
Read a Databricks table (Id method)
Remove a Databricks table (AsIs method)
Remove a Databricks table
Remove a Databricks table (Id method)
Rollback transaction (not supported)
Send query to Databricks (asynchronous)
Send statement to Databricks
Write table to Databricks (AsIs name signature)
Write a data frame to Databricks table
Write a data frame to Databricks table (Id method)
Returns the default config profile
Delta Sync Vector Search Index Specification
Determine brickster virtualenv
Direct Access Vector Search Index Specification
Docker Image
Email Notifications
Embedding Source Column
Embedding Vector Column
File Storage Information
For Each Task
GCP Attributes
Generate unique temporary table/view name
Get and Start Cluster
Get and Start Warehouse
Get Latest Databricks Runtime (DBR)
Git Source for Job Notebook Tasks
Detect if running within Databricks Notebook
Init Script Info
Test if object is of class AccessControlRequestForGroup
Test if object is of class AccessControlRequestForUser
Test if object is of class AccessControlRequest
Test if object is of class AwsAttributes
Test if object is of class AzureAttributes
Test if object is of class AutoScale
Test if object is of class ClusterLogConf
Test if object is of class ConditionTask
Test if object is of class CronSchedule
Test if object is of class DbfsStorageInfo
Test if object is of class DeltaSyncIndex
Test if object is of class DirectAccessIndex
Test if object is of class DockerImage
Test if object is of class JobEmailNotifications
Test if object is of class EmbeddingSourceColumn
Test if object is of class EmbeddingVectorColumn
Test if object is of class FileStorageInfo
Test if object is of class ForEachTask
Test if object is of class GcpAttributes
Test if object is of class GitSource
Test if object is of class InitScriptInfo
Test if object is of class JobTaskSettings
Test if object is of class CranLibrary
Test if object is of class EggLibrary
Test if object is of class JarLibrary
Test if object is of class MavenLibrary
Test if object is of class PyPiLibrary
Test if object is of class WhlLibrary
Test if object is of class Libraries
Test if object is of class Library
Test if object is of class NewCluster
Test if object is of class NotebookTask
Test if object is of class PipelineTask
Test if object is of class PythonWheelTask
Test if object is of class RunJobTask
Test if object is of class S3StorageInfo
Test if object is of class SparkJarTask
Test if object is of class SparkPythonTask
Test if object is of class SparkSubmitTask
Test if object is of class SqlFileTask
Test if object is of class SqlQueryTask
Test if object is of class JobTask
Test if object is of class VectorSearchIndexSpec
Job Task
Job Tasks
Cran Library (R)
Egg Library (Python)
Jar Library (Scala)
Maven Library (Scala)
PyPI Library (Python)
Wheel Library (Python)
Libraries
New Cluster
Notebook Task
Connect to Databricks Workspace
Pipeline Task
Python Wheel Task
Reads Databricks CLI Config
Reads Environment Variables
Remove Library Path
Run Job Task
S3 Storage Info
Show method for DatabricksConnection
Show method for DatabricksDriver
Show method for DatabricksResult
Spark Jar Task
Spark Python Task
Spark Submit Task
SQL File Task
SQL Query Fields for Databricks connections
Create temporary views and tables in Databricks
SQL Query Task
Handle table analysis for Databricks
SQL translation environment for Databricks SQL
Returns whether to use a .databrickscfg file
Wait for Libraries to Install on Databricks Cluster
Extract warehouse ID from an http_path
A collection of utilities that make 'Databricks' easier to use from R: primarily functions that wrap specific 'Databricks' APIs (<https://docs.databricks.com/api>), 'RStudio' connection pane support, and quality-of-life helpers that simplify working with 'Databricks'.
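As a sketch of the two access styles this index covers (direct API wrappers such as "List Clusters", and the DBI interface described under "DBI Interface for Databricks SQL Warehouses"), the snippet below is illustrative only: the wrapper name `db_cluster_list()`, the driver constructor `DatabricksSQL()`, and the `warehouse_id` argument are assumptions inferred from the titles above, not verified exports. Authentication is assumed to come from `DATABRICKS_HOST`/`DATABRICKS_TOKEN` environment variables, per the "Reads Environment Variables" and "Fetch Databricks Token" topics.

```r
library(DBI)
library(brickster)

# Style 1: direct API wrapper (hypothetical name; see "List Clusters").
# Host and token are resolved from the environment or .databrickscfg.
clusters <- db_cluster_list()

# Style 2: DBI interface (hypothetical constructor; see
# "Create Databricks SQL Driver" and "Connect to Databricks SQL Warehouse").
con <- dbConnect(DatabricksSQL(), warehouse_id = "<warehouse-id>")
result <- dbGetQuery(con, "SELECT 1 AS ok")
dbDisconnect(con)
```

Either style should work on its own; the DBI route additionally enables the dbplyr backend listed above ("dbplyr Backend for Databricks SQL") for lazy, translated queries.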