You can use the Maven coordinate of the jar to auto-install the Spark connector into your Databricks Runtime from Maven: com.azure.cosmos.spark:azure-cosmos-spark_3-3_2 ...
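Once the connector library is attached to the cluster, reading a container is a matter of pointing the DataFrame reader at it. A minimal sketch, assuming the connector jar is installed and using placeholder account values (`<account>`, `<key>`, `<database>`, `<container>` are not real credentials):

```python
# Placeholder connection settings for the Azure Cosmos DB Spark connector.
# Replace each <...> value with your own account details.
cosmos_cfg = {
    "spark.cosmos.accountEndpoint": "https://<account>.documents.azure.com:443/",
    "spark.cosmos.accountKey": "<key>",
    "spark.cosmos.database": "<database>",
    "spark.cosmos.container": "<container>",
}

def read_cosmos(spark, cfg=cosmos_cfg):
    # "cosmos.oltp" is the connector's DataSource format name; this returns
    # a DataFrame backed by the configured Cosmos DB container.
    return spark.read.format("cosmos.oltp").options(**cfg).load()
```

On Databricks you would call `read_cosmos(spark)` from a notebook, where `spark` is the pre-created session.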
Impatient and just want Jupyter with Apache Spark quickly? Place your notebooks in the notebook directory and, optionally, list your Python dependencies in requirements.txt. Then run: docker ...
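If you are not using this project's own image, a similar setup can be sketched with the community jupyter/pyspark-notebook image from the Jupyter Docker Stacks; the local notebook directory is mounted into the container's work folder (the exact command your project expects may differ):

```shell
# Sketch only: serve Jupyter with PySpark, mounting ./notebook into the
# container (jupyter/pyspark-notebook uses /home/jovyan as its home).
docker run -p 8888:8888 \
  -v "$PWD/notebook:/home/jovyan/work" \
  jupyter/pyspark-notebook
```

Jupyter then prints a tokenized URL on http://localhost:8888 to open in your browser.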