- In your Databricks Workspace, go to the Workflows pane.
- Click the Monte Carlo Job - it should be prefixed with
- On the right pane, under Compute, click the Swap button.
- In the pop-up box, click New job cluster.
- Create the job cluster with your desired settings. For the least expensive recommended configuration, see below. If the job takes too long at these settings, you can increase the cluster size or the number of workers.
  - Nodes: Single node
  - Node type: i4i.large (the least expensive AWS node type available)
Important 🚨: Under Advanced options, on the Spark tab, add these two entries under Spark config:
spark.databricks.clusterSource API
spark.databricks.hive.metastore.client.pool.size 40
- Click Confirm and then Update.
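If you prefer to define the job cluster programmatically instead of through the UI, the steps above can be sketched as a new-cluster spec for the Databricks Jobs API. The runtime version and the single-node profile settings below are illustrative assumptions, not Monte Carlo requirements; only the node type and the two Spark config entries come from the steps above.

```python
# Sketch of an equivalent single-node job cluster spec for the
# Databricks Jobs API. Values marked as assumptions are placeholders.
new_cluster = {
    "spark_version": "13.3.x-scala2.12",  # assumption: any current LTS runtime
    "node_type_id": "i4i.large",          # cheapest AWS node type, per the steps above
    "num_workers": 0,                     # single node: driver only, no workers
    "spark_conf": {
        # Assumption: Databricks single-node clusters are typically
        # configured with these two profile settings.
        "spark.databricks.cluster.profile": "singleNode",
        "spark.master": "local[*]",
        # The two Spark config entries required by the steps above:
        "spark.databricks.clusterSource": "API",
        "spark.databricks.hive.metastore.client.pool.size": "40",
    },
    "custom_tags": {"ResourceClass": "SingleNode"},
}
```

This dict would go inside the `job_clusters` array when updating the job definition via the Jobs API.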
- Under Permissions, make sure the Service Principal or User (if using a Personal Access Token) is selected as
- Verify that the job is able to run with the permissions available.
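One way to verify the permissions without waiting for a scheduled run is to trigger the job once through the Databricks Jobs `run-now` endpoint and confirm that a run starts. Below is a minimal sketch using only the Python standard library; the workspace URL, token, and job ID are placeholders you would substitute with your own values.

```python
import json
import urllib.request

def run_now_request(host: str, token: str, job_id: int) -> urllib.request.Request:
    """Build (but do not send) a Jobs API run-now request for the given job."""
    return urllib.request.Request(
        f"{host}/api/2.1/jobs/run-now",
        data=json.dumps({"job_id": job_id}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

if __name__ == "__main__":
    # Placeholders: substitute your workspace URL, access token, and job ID.
    req = run_now_request(
        "https://dbc-example.cloud.databricks.com", "<personal-access-token>", 123
    )
    with urllib.request.urlopen(req) as resp:
        # A successful response returns a run_id, confirming the principal
        # can launch the job; an HTTP 403 indicates missing permissions.
        print(json.load(resp))
```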