## Requirements

An Azure Administrator is required for this step. The user must have:

- `Owner` role for the subscription
- Access to the Azure Portal
## Step 1: Sign in to Azure

Use the correct URL for your environment:
| Environment | Login URL |
|---|---|
| Worldwide (Commercial) | https://portal.azure.com |
| GCC | https://portal.azure.com |
| GCC High | https://portal.azure.us |
| DoD | https://portal.azure.us |
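If you script against these environments with the Azure CLI, the same environment selection applies there; a minimal sketch (the cloud names are the standard Azure CLI values, and which cloud each tenant uses should be confirmed for your deployment):

```shell
# Point the Azure CLI at the right cloud before signing in.
# AzureCloud serves portal.azure.com (Worldwide/GCC);
# AzureUSGovernment serves portal.azure.us (GCC High/DoD).
az cloud set --name AzureUSGovernment
az login
```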
## Step 2: Create SQL Database

1. Type in `azure sql` in the top search textbox
2. Select `Azure SQL Database` from the search results
3. Click on `Create`
4. Set the following properties, and click on `Create new` under the `Server` property
| Name | Value |
|---|---|
| Subscription | The subscription created for this toolkit |
| Resource Group | sub-spark-rg |
| Database Name | sqldb-spark |
5. Set the following properties, and click on `OK`
| Name | Value |
|---|---|
| Server Name | sub-sql-spark-svr |
| Location | The same region you have been using for the Azure services |
| Authentication Method | Use Microsoft Entra-only Authentication |
| Set Microsoft Entra admin | SPARK System Admins |
6. Set the following properties, and click on `Next`

For tenant sizes < 100k sites:
| Name | Value |
|---|---|
| Compute + storage | Standard S0 2GB |
| Backup storage redundancy | Locally-redundant backup storage |
For tenant sizes > 100k sites:
| Name | Value |
|---|---|
| Compute + storage | Standard S1 5GB |
| Backup storage redundancy | Locally-redundant backup storage |
> [!NOTE]
> For tenants with more than 150k sites, it is recommended to raise the compute and storage to Premium P1 during intake and inventory runbook jobs, then return to Standard S1 after the jobs have completed.
7. Set the following properties, and click on `Next`
| Name | Value |
|---|---|
| Connectivity method | Public endpoint |
| Allow Azure services and resources to access this server | Yes |
8. Click on `Configure Identities` for the `Server identity` property
9. Add the `uami-spark-spoactions` identity
10. Select the `uami-spark-spoactions` identity as the `Primary Identity`
11. Click on `Apply`
> [!NOTE]
> If this section is not available during creation, you can set this property later.
12. Set `Enable secure enclaves` to `On`
13. Click on `Next` until you reach `Review + create`
14. Click on `Create`
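The server and database creation above can also be scripted with the Azure CLI; a hedged sketch, assuming the `sub-spark-rg` resource group already exists and `<entra-admin-object-id>` is the object ID of the `SPARK System Admins` group (placeholders in angle brackets are yours to fill in):

```shell
# Create the logical server with Microsoft Entra-only authentication
az sql server create \
  --name sub-sql-spark-svr \
  --resource-group sub-spark-rg \
  --location <region> \
  --enable-ad-only-auth \
  --external-admin-principal-type Group \
  --external-admin-name "SPARK System Admins" \
  --external-admin-sid <entra-admin-object-id>

# Create the database at the Standard S0 tier (< 100k sites)
az sql db create \
  --name sqldb-spark \
  --server sub-sql-spark-svr \
  --resource-group sub-spark-rg \
  --service-objective S0 \
  --max-size 2GB \
  --backup-storage-redundancy Local
```

For tenants above 100k sites, swap `--service-objective S1 --max-size 5GB` per the sizing table above.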
## Step 3: Configure SQL Access Control

1. Access the `Azure SQL` logical servers
2. Click on `SQL logical servers`
3. Select the `sub-sql-spark-svr` server created in the previous step
4. Select `Access control (IAM)` and click on `Add role assignment`
5. Search for `sql db`
6. Select `SQL DB Contributor` and click on `Next`
7. Select `Managed identity` for the `Assign access to` option
8. Select the `uami-spark-spoactions` identity, and click on `Select`
9. Click on `Review + assign`, and then `Review + assign` again to add the role assignment
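The role assignment can equally be done from the CLI; a sketch, assuming `<uami-principal-id>` is the principal (object) ID of the `uami-spark-spoactions` identity and `<subscription-id>` is your subscription:

```shell
# Grant the managed identity SQL DB Contributor on the logical server
az role assignment create \
  --role "SQL DB Contributor" \
  --assignee-object-id <uami-principal-id> \
  --assignee-principal-type ServicePrincipal \
  --scope "/subscriptions/<subscription-id>/resourceGroups/sub-spark-rg/providers/Microsoft.Sql/servers/sub-sql-spark-svr"
```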
## Step 4: Configure SQL UAMI Identity

1. Select `SQL databases` from the left navigation
2. Select the `sub-sql-spark-svr` database
3. Under `Security`, click on `Identity`
4. Select `Add`, select the `uami-spark-spoactions` identity, and then click on `Add`
5. Click `Save`
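This assignment can also be made at the server level with the CLI; a sketch, assuming `<uami-resource-id>` is the full Azure resource ID of the `uami-spark-spoactions` identity (verify the flags against your installed CLI version):

```shell
# Attach the user-assigned managed identity and make it the primary identity
az sql server update \
  --name sub-sql-spark-svr \
  --resource-group sub-spark-rg \
  --identity-type UserAssigned \
  --user-assigned-identity-id <uami-resource-id> \
  --primary-user-assigned-identity-id <uami-resource-id>
```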
## Step 5: Get SQL Connection String

1. Click on `SQL databases` from the left navigation
2. Select the `sqldb-spark` database
3. Under `Overview`, click on `See connection strings`
4. Annotate the `Microsoft Entra passwordless authentication` database connection string
### Update Connection String

Replace `TrustServerCertificate=False;Connection Timeout=30;Authentication="Active Directory Default";` with `MultipleActiveResultSets=True;`.

Before:

```
Server=tcp:sub-sql-spark-svr-qa1.database.windows.net,1433;Initial Catalog=sub-sql-spark-svr-qa1;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;Authentication="Active Directory Default";
```

After:

```
Server=tcp:sub-sql-spark-svr.database.windows.net,1433;Initial Catalog=sqldb-spark;Encrypt=True;MultipleActiveResultSets=True;
```
> [!NOTE]
> Annotate the following variables in the template spreadsheet:
>
> - `v_sqlConnectionString`: The SQL connection string
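Because the substitution above is mechanical, it can be scripted; a minimal sketch using bash parameter expansion (the sample string is the one shown above):

```shell
#!/usr/bin/env bash
# Swap the Entra "Active Directory Default" fragment for MultipleActiveResultSets.
conn='Server=tcp:sub-sql-spark-svr.database.windows.net,1433;Initial Catalog=sqldb-spark;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;Authentication="Active Directory Default";'
old='TrustServerCertificate=False;Connection Timeout=30;Authentication="Active Directory Default";'
new='MultipleActiveResultSets=True;'
echo "${conn/$old/$new}"
# → Server=tcp:sub-sql-spark-svr.database.windows.net,1433;Initial Catalog=sqldb-spark;Encrypt=True;MultipleActiveResultSets=True;
```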
## Step 6: Connect to SQL Database

1. Click on `SQL databases` from the left navigation
2. Select the `sqldb-spark` database
3. Under `Query editor`, click on `Allow IP [IP Address]...` to be able to connect to the database
4. Click on `Continue as [user account]` to access the query editor
> [!NOTE]
> The user must be part of the SPARK System Admins group.
## Step 7: Run SQL Queries

- From the query editor, run the following scripts in the order shown in the table below
- Confirm each query runs successfully
| Name | Files |
|---|---|
| Create Users | 01_create_spark_sql_users |
| Create Tables | 02_create_spark_sql_tables |
| Create Views | 03_create_spark_sql_views |
| Create Stored Procedures | 04_create_spark_sql_procs |
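As an alternative to the portal query editor, the scripts can be run with the `sqlcmd` utility using Microsoft Entra authentication (`-G`); a sketch, assuming the `.sql` files are in the current directory and the signed-in user is in the `SPARK System Admins` group:

```shell
# Run the scripts in order against the database; -G uses Microsoft Entra auth
for f in 01_create_spark_sql_users 02_create_spark_sql_tables \
         03_create_spark_sql_views 04_create_spark_sql_procs; do
  sqlcmd -S tcp:sub-sql-spark-svr.database.windows.net,1433 \
         -d sqldb-spark -G -i "$f.sql"
done
```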