We have created a resource group in an internal subscription for you. All you have to do is go to portal.azure.com and search for the resource group RG_[alias], where [alias] is the alias part of your email address (the portion before the @).
If you cannot locate the resource group, please contact us.
All resources you create should be placed inside that resource group.
- Create your own virtual network.
- Provision an Azure Databricks workspace injected into the virtual network created in the previous step (VNet injection).
- Create one or more clusters in your workspace that will allow you to complete the next steps.
- Create an ADLS Gen2 storage account in the Azure portal.
- Create a container in that storage account and upload a sample CSV file to it.
- Access/read the CSV file as a DataFrame in Databricks using Spark (a PySpark sketch follows this list).
- Create an external and a managed Delta table in the Databricks workspace (see the Delta table sketch after this list).
- Create a Databricks Workflow that runs a notebook, for example one implementing a simple streaming scenario (see the Structured Streaming sketch after this list).
- Trigger the previously created workflow using the Databricks Jobs API (see the `run-now` sketch after this list).
- Register and serve a model, then get predictions from the served model (see the MLflow sketch after this list).
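
The sketches below illustrate some of the steps above in Python/PySpark. They are starting points under stated assumptions, not definitive implementations: every storage account, container, secret scope, table name, path, workspace URL, token, and job ID in them is a placeholder to replace with your own values.

To read the uploaded CSV as a DataFrame from a Databricks notebook, a minimal sketch assuming authentication with the storage account key kept in a secret scope:

```python
# Minimal sketch for reading the CSV from ADLS Gen2 in a Databricks notebook.
# <storage-account>, <container>, the "lab-scope" secret scope and sample.csv are
# placeholders/assumptions -- substitute your own names.
storage_account = "<storage-account>"
container = "<container>"

# Authenticate with the storage account key (simplest for a lab exercise;
# a service principal is preferable for real workloads).
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    dbutils.secrets.get(scope="lab-scope", key="storage-account-key"),
)

csv_path = f"abfss://{container}@{storage_account}.dfs.core.windows.net/sample.csv"

df = (
    spark.read
    .option("header", "true")       # first row holds column names
    .option("inferSchema", "true")  # let Spark infer column types
    .csv(csv_path)
)

display(df)  # Databricks notebook helper to render the DataFrame
```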
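
For the Delta tables, one possible sketch reusing the `df` DataFrame from above; the `lab_db` database, the table names, and the external path are assumptions:

```python
# Minimal sketch: one managed and one external Delta table from the DataFrame `df` above.
spark.sql("CREATE DATABASE IF NOT EXISTS lab_db")

# Managed table: Databricks controls both the metadata and the data location.
df.write.format("delta").mode("overwrite").saveAsTable("lab_db.sample_managed")

# External table: data is written to a path you own; dropping the table keeps the files.
external_path = f"abfss://{container}@{storage_account}.dfs.core.windows.net/delta/sample_external"
(
    df.write.format("delta")
    .mode("overwrite")
    .option("path", external_path)
    .saveAsTable("lab_db.sample_external")
)
```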
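
For the workflow's notebook, a small Structured Streaming scenario that needs no external input could look like the sketch below; it uses the built-in `rate` source, and the checkpoint path and target table are assumptions:

```python
# Minimal sketch of a streaming notebook that a Databricks Workflow could run.
stream_df = (
    spark.readStream
    .format("rate")              # synthetic source emitting (timestamp, value) rows
    .option("rowsPerSecond", 5)
    .load()
)

query = (
    stream_df.writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/lab/checkpoints/rate_stream")  # assumed path
    .toTable("lab_db.rate_stream")                                     # assumed table name
)

query.awaitTermination(60)  # let the stream run for about a minute inside the job run
query.stop()
```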
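
Triggering the workflow through the Jobs API comes down to a plain HTTP call to the `run-now` endpoint; the workspace URL, personal access token, and job ID below are placeholders:

```python
# Minimal sketch: trigger an existing Databricks job (workflow) through the Jobs 2.1 API.
import requests

DATABRICKS_HOST = "https://<your-workspace>.azuredatabricks.net"  # assumed workspace URL
TOKEN = "<personal-access-token>"  # create one under User Settings in the workspace
JOB_ID = 123                       # numeric ID of the workflow you created

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"job_id": JOB_ID},
    timeout=30,
)
resp.raise_for_status()
print("Triggered run_id:", resp.json()["run_id"])
```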
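
Finally, for model registration and serving, one way to sketch it is to log a toy scikit-learn model with MLflow (which also registers it), create a serving endpoint for the registered model from the Serving UI, and then request predictions from the endpoint's invocations URL; the model name, endpoint name, and feature column are assumptions:

```python
# Minimal sketch: register a toy model with MLflow, then score it via a serving endpoint.
import mlflow
import mlflow.sklearn
import pandas as pd
import requests
from sklearn.linear_model import LinearRegression

# Train and register a trivial model; registered_model_name creates/updates the registry entry.
X = pd.DataFrame({"x": [1.0, 2.0, 3.0, 4.0]})
y = [2.0, 4.0, 6.0, 8.0]
with mlflow.start_run():
    model = LinearRegression().fit(X, y)
    mlflow.sklearn.log_model(model, "model", registered_model_name="lab_linear_model")

# After creating a serving endpoint (e.g. "lab-endpoint") for that model in the Serving UI,
# request predictions from its invocations URL:
resp = requests.post(
    "https://<your-workspace>.azuredatabricks.net/serving-endpoints/lab-endpoint/invocations",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json={"dataframe_split": {"columns": ["x"], "data": [[5.0], [6.0]]}},
    timeout=30,
)
print(resp.json())
```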