brunocampos01/automated-business-intelligence-at-azure

This project contains Python and HCL (Terraform) scripts to create an automated business intelligence environment on Azure.

This project provisions all the infrastructure needed to create a data warehouse environment on Azure.
In addition, all the scripts that support Azure Analysis Services are dynamically generated from a configuration file (set_variables.tfvars).

Below are the steps that must be followed to prepare the environment and run the automation:

  1. Clone this project to the user's main directory on the on-premises server, for example C:\Users\PRODUCT_NAME:

```powershell
cd ~
git clone https://github.com/brunocampos01/automated-business-intelligence-at-azure.git
```

  2. First, you need to create and configure a user on Azure. Follow the Azure tutorial to create the user that Terraform will use.

  3. Once you have created the new user on Azure, set the parameters that Terraform will use. Open the set_variables.tfvars file at the root of the project and edit the variables.

Example of how to configure the set_variables.tfvars file:

```hcl
# Azure
subscription_id = "11111111ffff-111b-111b-8a2e-11111111ffff"
client_id       = "11111111ffff-111b-111b-8552-11111111ffff"
client_secret   = "11111111ffff-111b-111b-9606-11111111ffff"
tenant_id       = "11111111ffff-111b-111b-a933-11111111ffff"
location        = "brazilsouth"
tags = {
  analytics    = "bi"
  PRODUCT_NAME = "aa"
}

# User with access to the Azure portal. Cannot be the terraform user
application_user_login    = "application_user@PRODUCT_NAME.com.br"
application_user_password = "111b111b-111bd"

# Client and product name
client_name               = "AA"
client_name_lower         = "aa"
product_name              = "PRODUCT_NAME"
product_client_name       = "PRODUCT_NAME-MPAA"
product_client_name_lower = "PRODUCT_NAME-mpaa"

# If Runbooks fail, send email
email_from  = "smtp@company.com.br"
email_to    = "brunocampos01@gmail.com"
smtp_server = "email-smtp.com"
smtp_port   = "587"

# Data source
data_source = "oracle"

# Analysis Services
large_volume_table = "fact_historic"
column_to_split    = "id_column_name"
total_month        = "18"

# Root access in all Analysis Services databases
list_admins_global = ["application_user@PRODUCT_NAME.com.br", "brunocampos01@gmail.com"]

# These logins must be comma-separated, without spaces
list_admins  = "brunocampos01@gmail.com"
list_readers = "brunocampos01@gmail.com,application_user@PRODUCT_NAME.com.br"
```
  4. Follow the step-by-step below to run the automation correctly:
  • Open PowerShell as admin
  • Run the following commands in PowerShell:

```powershell
cd PRODUCT_NAME-mp/main
terraform init   # download the terraform modules
terraform plan -var-file="..\set_variables.tfvars" -out plan.out
terraform apply -var-file="..\set_variables.tfvars" -auto-approve -parallelism=1
```
  • From this point on, the provisioning and configuration of the services proceeds automatically. Next, go to Manual Service Provisioning in the Cloud.

Manual Service Provisioning in the Cloud

Once service provisioning has begun, you must perform one task inside the cloud environment to grant the Automation account permission to run scripts. Follow the step-by-step below:

  1. Access the Azure portal
  2. Inside the portal, go to Home > Automation Accounts
  3. From the left menu, choose Run as accounts and confirm the creation of the certificate.
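The runbook code is not shown in this README, but the certificate confirmed above is what the runbooks use to sign in to Azure. A minimal sketch of that authentication pattern, assuming the Az modules are imported into the Automation account ("AzureRunAsConnection" is the default connection asset name):

```powershell
# Minimal sketch: authenticate a runbook with the Run as account certificate.
# Assumes the Az modules are available in the Automation account;
# "AzureRunAsConnection" is the default connection asset name.
$connection = Get-AutomationConnection -Name "AzureRunAsConnection"

Connect-AzAccount -ServicePrincipal `
    -TenantId $connection.TenantId `
    -ApplicationId $connection.ApplicationId `
    -CertificateThumbprint $connection.CertificateThumbprint
```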

On-premises Data Gateways

An On-premises Data Gateway is required to transfer data from the data source to Azure Analysis Services.

The gateway must first be installed and configured on the on-premises machine that will run it. Then, follow the steps below:

  1. Access the Azure portal
  2. Wait about 15 minutes and then go to Home > On-premises Data Gateways
  3. Click + Add
  • The table below gives suggested configuration values
| Parameter | Value |
| --- | --- |
| Resource Name | <product_name>-<client_name>-gw |
| Location | Brazil South |
| Installation Name | <product_name>-<client_name>-onpremisesgw |

NOTES:

  • It is important to use the same account that was logged into the data gateway on the local server.
  • The Installation Name field is the name of the local gateway, configured on the local server. It should appear in the list when you click the down arrow.

Link Data Gateway with Azure Analysis Services

  1. Access the Azure portal
  2. Inside the portal, go to Home > Analysis Services
  3. From the left menu, choose On-premises Data Gateway
  4. Select the gateway from the list and then click Connect Gateway
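If you prefer scripting over portal clicks, the same association can reportedly be made with the Az.AnalysisServices module. A hedged sketch, with illustrative resource and server names:

```powershell
# Sketch: associate the on-premises data gateway with the Analysis Services
# server (assumes Az.AnalysisServices; resource names below are illustrative).
$gateway = Get-AzResource -ResourceType "Microsoft.Web/connectionGateways" `
    -Name "PRODUCT_NAME-aa-gw"

Set-AzAnalysisServicesServer -ResourceGroupName "my-resource-group" `
    -Name "myaasserver" `
    -GatewayResourceId $gateway.ResourceId
```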

Data Processing for Azure Analysis Services

After all services have been created on Azure (approximately 15 minutes), the Automation account runs a runbook that updates the PowerShell modules and then runs the runbook that creates the Azure Analysis Services database.
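The creation runbook itself is generated by Terraform and not reproduced here, but creating a tabular database usually comes down to sending a TMSL createOrReplace command. A minimal sketch, assuming the SqlServer module and illustrative server, database, and credential names:

```powershell
# Sketch: create an empty tabular database with a TMSL command, roughly what
# the creation runbook does (assumes the SqlServer module; names are illustrative).
$credential = Get-AutomationPSCredential -Name "application_user"

$tmsl = @'
{
  "createOrReplace": {
    "object": { "database": "PRODUCT_NAME-aa" },
    "database": {
      "name": "PRODUCT_NAME-aa",
      "compatibilityLevel": 1400,
      "model": { "culture": "en-US", "tables": [] }
    }
  }
}
'@

Invoke-ASCmd -Server "asazure://brazilsouth.asazure.windows.net/myaasserver" `
    -Credential $credential -Query $tmsl
```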

After about 40 minutes, the database has been created. To be able to process data, you need to set the data source password via Visual Studio (SSDT). Follow the step-by-step below:

  1. Check that Azure Analysis Services is on. To do so, access the Azure portal

  2. Inside the portal, go to Home > Analysis Services

If it is paused, just click Start and wait for the service to begin (or resume it from PowerShell, as in the sketch below).
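A PowerShell equivalent of this check, assuming the Az.AnalysisServices module and illustrative names:

```powershell
# Sketch: check the server state and resume it if paused
# (assumes Az.AnalysisServices; names below are illustrative).
$server = Get-AzAnalysisServicesServer -ResourceGroupName "my-resource-group" `
    -Name "myaasserver"

if ($server.State -eq "Paused") {
    Resume-AzAnalysisServicesServer -ResourceGroupName "my-resource-group" `
        -Name "myaasserver"
}
```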

  3. Copy the server link as indicated in the image below.

  4. Open SSDT and start a new project.

  5. Choose the option Analysis Services > Import Tabular and rename the project to the standard <product_name>-<client_name>.

  6. The endpoint of the database will then be requested. Paste the link copied from Azure and wait for the environment preparation.

  7. In the right-hand menu, open the Data Sources directory and open the connection with a double click. Enter the password of the source database and test the connection. If everything goes right, confirm and save the password.

  8. To avoid having to bring the data into Visual Studio (SSDT), change the following project property: Debug > TabularProject1 Properties > Deployment Options > Processing Option > Do Not Process. In this same window, change the Server field to the Azure Analysis Services endpoint.

  9. Build and deploy by clicking the Start button.

  10. Finished. The Azure Analysis Services database can now process data from its data sources.

Automated Processing

From 55 minutes on, if the data source password has been set, all partitions except those of the large-volume table are processed through the Automation account. In sequence, the Automation account creates and processes the large-volume table partitions, makes a backup, and pauses Azure Analysis Services. A sketch of these steps follows below.
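The processing runbooks are generated from the configuration file, but what they do can be sketched with the Analysis Services cmdlets from the SqlServer module. A hedged sketch, with illustrative table, partition, and server names:

```powershell
# Sketch: roughly what the processing runbooks do (assumes the SqlServer and
# Az.AnalysisServices modules; all names below are illustrative).
$credential = Get-AutomationPSCredential -Name "application_user"
$server     = "asazure://brazilsouth.asazure.windows.net/myaasserver"
$database   = "PRODUCT_NAME-aa"

# Process a regular table in full (repeated for every table except the
# large-volume one).
Invoke-ProcessTable -Server $server -DatabaseName $database `
    -TableName "dim_example" -RefreshType Full -Credential $credential

# Process one monthly partition of the large-volume table.
Invoke-ProcessPartition -Server $server -DatabaseName $database `
    -TableName "fact_historic" -PartitionName "fact_historic_201901" `
    -RefreshType Full -Credential $credential

# Back up the database, then pause the server to save costs.
Backup-ASDatabase -Server $server -BackupFile "$database.abf" `
    -Name $database -Credential $credential
Suspend-AzAnalysisServicesServer -ResourceGroupName "my-resource-group" `
    -Name "myaasserver"
```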


Creative Commons License

