Product and Technology

Connect & Use Apache Spark/Databricks

Written by TimeXtender | February 13, 2023

TimeXtender Tuesday - Episode 34

Join our expert Solution Specialist, Frank Wagner, as he shows you how quickly you can connect Apache Spark, Databricks, and TimeXtender. Follow along as he loads raw data into the data lake and moves curated data into the data warehouse, giving business users the insights they need to make confident, data-driven decisions. The same process gives data scientists access to both curated and raw data for advanced analysis, bringing together the best of both worlds.

Great news! TimeXtender is now FREE! If you are ready to get started modeling data and providing data-driven insights to your business, you can start your free plan today.

Visit now: https://www.timextender.com/freemium

If you'd like to review previous TimeXtender Tuesday sessions, check out our TimeXtender Tuesday playlist on YouTube.


Other helpful links Frank mentioned:

About Azure Key Vault: https://learn.microsoft.com/en-us/azure/key-vault/general/overview
 
Create Azure Key Vault backed secret scope in Azure Databricks: https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes
 
Connect Spark to a SQL Server and read data from it: https://learn.microsoft.com/en-us/azure/databricks/external-data/sql-server
 
Mount Data Lake in Spark/Databricks: https://learn.microsoft.com/en-us/azure/databricks/dbfs/mounts
 
Read Parquet files with Azure Databricks: https://learn.microsoft.com/en-us/azure/databricks/external-data/parquet
 
Apache PySpark DataFrame API: https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/dataframe.html
 
Apache PySpark DataFrame join API: https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.DataFrame.join.html
 
 

TimeXtender Tuesday Links:

Taking it further: