Apache Spark on Databricks

Connection Type


Distributions Validated On


Server Details

See the Databricks documentation for server details.

Type of Support


Validated On

ODBC Client Version:

Alteryx Tools Used to Connect

In-database Workflow Processing

Connect In-DB tool
Data Stream In tool
Apache Spark Code tool

Connect to Apache Spark by dragging a Connect In-DB tool or the Apache Spark Code tool onto the canvas. Create a new connection to Databricks using the Apache Spark on Databricks driver. Use the instructions below to configure the connection. See Databricks documentation for more information.

Configure the Databricks Connection window

To connect to Databricks and create an Alteryx connection string:

  1. Enter your Account ID or Region.

    • For Databricks hosted on AWS, enter your Databricks Account ID. You can retrieve the Account ID in the Databricks account console by selecting the down arrow next to your username in the upper right corner.

    • For Databricks hosted on Azure, enter your Region. Region is the Azure Data Center location. It can be provided by your Databricks Admin.

  2. Paste the Databricks Token you generated in your Databricks user settings. Note that tokens can expire and can be revoked, so you may need to generate a new one.

  3. Select Connect. Designer displays a list of Databricks clusters to connect to. If the connection is not successful, try entering your credentials again.

  4. Select a Databricks Cluster to connect to.

  5. Select a Session Type, depending on the code you are writing:

    • Scala

    • Python

    • R

  6. Optionally, enter a descriptive Run Name for the job so that you can identify it later. Run names help users distinguish one job from another on the server. The name defaults to Untitled if left blank.

  7. Set the Timeout in minutes. This is the number of minutes of inactivity allowed before the job stops. For example, if you enter 15, the job can sit idle for 15 minutes before it times out. See the Databricks documentation for more information.

  8. To write your own code, add Libraries beyond the set that is already provided.

    These file types are supported:

    • Jar: Java ARchive.

    • Egg: Single-file importable distribution format for Python-related projects.

    • PyPI: Python Package Index, a repository of software for Python.

    • Maven: A repository for files and artifacts.

    • CRAN: R file package.
    Select the "+" icon to add a row. Select Save to save the library configuration settings to a file. Use the file folder icon to locate a saved configuration file. To delete a row, hover over it and then select the trash icon.

  9. In the Databricks Connection window, select OK.

  10. In Manage In-DB Connections, select OK to create an Alteryx connection string.
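Behind the scenes, the token and cluster selections above correspond to Databricks REST API calls that Designer makes for you. As a hedged sketch only (the workspace URL, token value, and cluster ID below are placeholders, and you would not normally need to call the API yourself), the same personal access token can authorize a request to list clusters, as in step 4, or to install one of the library types from step 8:

```python
import json
import urllib.request

def build_request(workspace_url, token, endpoint, payload=None):
    """Build an authenticated Databricks REST API request.

    The personal access token (step 2) is sent as a Bearer token in the
    Authorization header; this mirrors how Designer authenticates when it
    fetches the list of clusters after you select Connect.
    """
    data = json.dumps(payload).encode() if payload is not None else None
    return urllib.request.Request(
        f"{workspace_url}/api/2.0/{endpoint}",
        data=data,
        headers={"Authorization": f"Bearer {token}"},
        method="POST" if data else "GET",
    )

# List the clusters available to connect to (step 4).
req = build_request(
    "https://example.cloud.databricks.com",  # placeholder workspace URL
    "dapiXXXX",                              # placeholder token
    "clusters/list",
)

# Install a PyPI library on a cluster (step 8); "jar", "egg", "maven",
# and "cran" library specs follow the same request shape.
lib_req = build_request(
    "https://example.cloud.databricks.com",
    "dapiXXXX",
    "libraries/install",
    payload={
        "cluster_id": "1234-567890-abcde123",  # placeholder cluster ID
        "libraries": [{"pypi": {"package": "simplejson"}}],
    },
)
# urllib.request.urlopen(req) would send the request to a real workspace.
```

If Designer cannot retrieve the cluster list (step 3), a request like the first one above is a quick way to check whether the token itself is valid.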