COURSE OVERVIEW


Training Module:

JOBS

Jobs are used to schedule transfers and executions for all instance types, such as ODX, MDW, and SSL.


Table of Contents

  • Key Concepts: Jobs
    • What are Jobs?
    • Create a New Execution Package
  • Using Jobs
    • Using Jobs
    • (optional) Manage Jobs
    • (optional) Enable On-Demand
  • Execution Queue
    • ODX Execution Queue
    • Execution Queue
    • Object Dependencies
  • PowerShell + Jobs
    • What are External Executions?
    • (optional) External Executions (PowerShell)
    • (optional) Example PowerShell Scripts
  • Jobs API Endpoints
    • What are TimeXtender API Endpoints?
    • (optional) API Endpoints & Postman Collection

Key Concepts: Jobs

What are Jobs?

Jobs are the way to schedule and automate tasks and execution packages for

Operational Data Exchange (ODX), Data Warehouse (MDW), and Semantic Model (SSL) instances.

Types of Jobs in TimeXtender

1. Jobs to Schedule ODX Tasks

For ODX instances, jobs are used to schedule tasks such as transfer, synchronization, and storage management.

[Diagram: ODX Tasks + Schedule = Job]

When to Use: to schedule ODX Tasks and update the raw data stored in your data lake or on-premises SQL Server storage.


For example, scheduling the ODX (only) is ideal for incremental loads, large table transfers scheduled outside of busy hours, and storage clean-up tasks.


Learn more about Tasks in ODX.

[Diagram: Transfer Tasks load data from data sources into ODX Storage]

2. Jobs to Schedule Execution Packages (and more)

For Data Warehouse and Semantic Model instances, jobs are used to schedule execution packages. This uses Data On-Demand, rather than ODX Tasks.

[Diagram: Execution Packages + Schedule = Job]

When to Use: to schedule end-to-end data refresh (i.e. ODX, MDW, SSL, and Semantic Endpoints)


For example, using jobs to run Execution Packages is perfect for “refreshing” a dashboard with the latest data, especially with the “Data On-Demand” feature.


Learn more about Execution Packages and Data On-Demand.


Please check out Scheduling Executions using Jobs to learn more.

Create a New Execution Package

An execution package determines which objects in an instance will be executed and how.

Create an Execution Package (MDW)

  1. Under Data Warehouses, right-click Execution, then click Open
    1. For Semantic Models (only), right-click Execution under Semantic Models, then click Open
  2. Right-click Execution Packages, then click Add Execution Package
  3. Configure the new Execution Package
    1. Add Name
    2. Increase Max Threads to desired number, based on your environment and processing power
    3. Change Managed Execution to “ExecutionTime”

Edit an Existing Execution Package (MDW)

  1. Right-click the Execution Package to configure, and select Edit Execution Package.
  2. Make the desired modifications to the package, and select OK.

For more details about the Execution Package configuration options, please check out Configuring Execution Packages.


Using Jobs

Using Jobs

There are essentially two (2) types of Jobs in TimeXtender. They are mutually exclusive, meaning that they cannot be combined, nor do they need to be, as they serve different purposes.

For End-to-End Scheduling Automation:

Configure Job with MDW and SSL Instances

  1. Right-click the Jobs node in the left-hand pane
  2. Select Add Job.
  3. Provide a name for the job, click Next, and select an instance for the job.
  4. Select the Execution Packages to include in the job, then click Add (for each). Click Next.
  5. Add schedule settings to your preference, and click Finish.

For ODX Data Refresh (only):

Configure Job with ODX Instances

  1. Right-click the Jobs node in the left-hand pane
  2. Select Add Job.
  3. Provide a name for the job, click Next, and select an instance for the job.
  4. Select the ODX Tasks to include in the job, then click Add (for each). Click Next.
  5. Add schedule settings to your preference, and click Finish.

WARNING: Only tasks relating to one ODX instance can be added to a job. In other words, tasks from different ODX instances cannot be added to the same job.


For more information on tasks, see Tasks in an ODX instance.

Manage Jobs

Optional

Edit (or Delete) a Job

  1. Right-click the job, and select the action you require.
    1. Click Edit Job
    2. Or, click Delete Job

Edit Job Schedule

  1. Right-click Jobs, and select Manage Schedule

Monitor Jobs

  1. Right-click Jobs, and select Monitor...
  2. The Monitor pane shows if a job is valid, the state of the job, its execution state, and the last execution date.

Enable On-Demand

Optional

What is “Data on Demand”?


Data On-Demand is an advanced setting on a data source. The idea is to simplify data transfers and updates using a single job, without creating or using an ODX Transfer Task*. This means that you only create the end-to-end data refresh, ensure that the data source’s ‘Data on demand’ feature is enabled, and schedule the Job.


When to use: to schedule end-to-end data refresh (and “bypass” the ODX tasks and logs).


*Please note: when data is loaded using “Data on Demand”, the ODX will NOT create log entries, due to the way that the data is queried, “bypassing” the ODX tasks and logs and allowing for minimal setup and configuration. If you prefer to have ODX execution logs for each data refresh, you should NOT use the ‘Data on demand’ feature, but rather create two (2) jobs: an ODX Job + an End-to-End Job. By using these two (2) distinct jobs together, the ODX execution logs will be created.


Learn more about Data On-Demand.


Enable “Data on Demand” in Desktop

  1. On TX Desktop, right-click your data source, and select Edit Data Source
  2. Click on Advanced Settings
  3. Check the ‘Data on demand’ box

WARNING: Data on demand does NOT support:

  • ODX Execution Logs (see above)
  • ADF Data sources


See: Data On-Demand.


Execution Queue


ODX Execution Queue

  1. Be sure the ODX instance is open, then in the Tools menu, click ODX Execution Queue.
  2. In the Execution Queue, you can see the tasks that are currently executing, waiting to start, or have just finished executing.

The execution queue can also be viewed by right-clicking the ODX instance and selecting View Execution Queue.

Execution Queue

  1. Right-click on a table or data area and select Execute.
  2. Check the Send to execution queue box.

Object Dependencies


  1. Right-click the desired table, then select Advanced > Object Dependencies.
  2. In the pop-up, select the desired table/view/stored procedure dependency.
  3. Click OK.

Object dependencies are identified by selecting the tables that should be executed prior to the selected table. This may help avoid a deadlock.


PowerShell + Jobs

What are External Executions?


An "external execution" is a separate program or script designed to carry out specific tasks related to data, which “extends” the native functionality of a program or software offering. By using External Executions, a developer can leverage the capabilities of Jobs + PowerShell to manage infrastructure and storage, advanced data analysis, or more complex operations.

[Diagram: Jobs + PowerShell]

For example, data engineers may use these external executables to augment and automate data pre-processing or post-processing steps, as a PowerShell script, as part of a Job in TimeXtender.


  1. Infrastructure automations like modifying an Azure SQL DB (“sizing”), from scaling-up (for more processing power during peak usage times), to scaling-down (for cost savings during low usage times).
  2. Analytics & Processing automations like processing an Analysis Services Database (“full refresh”), or pausing Power BI Embedded.
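For instance, the Power BI Embedded example above can be sketched as a one-cmdlet PowerShell script. This is a minimal sketch, assuming the Az.PowerBIEmbedded module is installed; the capacity and resource group names are placeholders, not values from this module:

```powershell
# Requires the Az.PowerBIEmbedded module; names below are placeholders.
# Pause a Power BI Embedded capacity to save cost during low-usage hours.
Suspend-AzPowerBIEmbeddedCapacity `
    -Name "<CapacityName>" `
    -ResourceGroupName "<ResourceGroupName>"
```

A matching Resume-AzPowerBIEmbeddedCapacity call, scheduled in a second job, would bring the capacity back online before business hours.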

External Executions (PowerShell)

Create & execute an External Executable

  1. In your Data Warehouse, right-click Execution and select Open.
  2. Right-click External Executables and select Add Execute PowerShell Script Step.
  3. Add Name, select the desired Timeout duration (minutes), and enter the script in the Script pane.
  4. Click Execute. A message will be displayed in the results pane, indicating whether the script executed successfully or resulted in an error.

Edit External Executable

  1. Under External Executables, right-click the script you’d like to edit and select Edit Execute PowerShell Script Step.

Add External Executable to Execution Package

  1. Create a new Execution Package, or edit an existing one. Do NOT modify the Default Execution Package; instead, create a new one for the sake of modification.
    1. In this example, you’ll create a new Execution Package “RefreshSSAS”.
  2. Click and drag the External Executable to Include Steps. The change will take effect on the next execution of that Execution Package.

Pro-Tip: Best practice is to NOT change the behavior of the Default Execution Package, as this is used for manual executions (i.e., during development tasks), and changes may have unintended or unexpected consequences for your solution. When in doubt, create a new execution package, or ask your Partner or SSP.


Read more about creating Execution Packages.

Example PowerShell Scripts

Optional

To see the complete list of example PowerShell scripts, please check out the Knowledge Base article on the topic, Execute PowerShell Script as External Executable.


Here are three (3) examples from that article to give you a preview.

[Diagram: Process External DB]
Example PowerShell Script: Process Analysis Services Database

# Requires the SqlServer PowerShell module
Invoke-ProcessASDatabase `
    -Server "localhost" `
    -DatabaseName "SalesOnPrem" `
    -RefreshType "Full"

[Icon: Increase Processing Power]

Example PowerShell Script: Scale Up Azure SQL DB

# Requires the Az.Sql PowerShell module
Set-AzSqlDatabase `
    -ResourceGroupName <ResourceGroupName> `
    -ServerName <ServerName> `
    -DatabaseName <DB Name> `
    -Edition "Premium" `
    -RequestedServiceObjectiveName "P1"

[Icon: Decrease Cost]

Example PowerShell Script: Scale Down Azure SQL DB

# Requires the Az.Sql PowerShell module
Set-AzSqlDatabase `
    -ResourceGroupName <ResourceGroupName> `
    -ServerName <ServerName> `
    -DatabaseName <DB Name> `
    -Edition "Basic"


Jobs API Endpoints

What are Jobs API Endpoints?

The purpose of these API endpoints is to provide a programmatic way to interact with and manage data jobs in TimeXtender. Jobs API endpoints simplify programmatic access and automation for data estate management and orchestration.


For a complete list of the API Endpoints and parameters, please see the Knowledge Base article, TimeXtender API Endpoints.

[Diagram: Jobs + API]

For example, data engineers may use these API Endpoints to augment and automate Jobs-related work. Here are some examples.


  1. Job Execution: the API endpoints allow you to execute data jobs programmatically. This can be useful for automating the execution of data jobs on a schedule or integrating data jobs with other systems.
  2. Job Monitoring: the API endpoints allow you to monitor the status and logs of data jobs. This can help you track the execution of data jobs and identify any issues or errors that may occur.
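As a rough sketch of the Job Execution use case, a job could be triggered from PowerShell. The base URL, endpoint path, and API-key header below are illustrative placeholders, not confirmed values; check the TimeXtender API Endpoints article for the real ones:

```powershell
# NOTE: the URI and header name are hypothetical placeholders -
# see the TimeXtender API Endpoints knowledge base article.
$apiKey = "<your API key>"
$jobId  = "<Job ID>"

# Trigger a job execution programmatically
Invoke-RestMethod `
    -Method Post `
    -Uri "https://<your-domain>/api/jobs/$jobId/execute" `
    -Headers @{ "X-Api-Key" = $apiKey }
```

A script like this could itself be scheduled (e.g., by an external orchestrator) to integrate TimeXtender jobs with other systems.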

API Endpoints & Postman Collection

Optional

Import Jobs collection

  1. Download the jobs collection: TimeXtender Postman collection.
  2. Click Import and drop in the downloaded collection.
  3. Open Public - Jobs to see all the job requests.

Prerequisites

  1. In Postman, select the Public - Jobs collection that has been imported in the left sidebar.
  2. Select the Variables tab.
  3. Enter the API key in the Current value for the variable apiKey (for more info on how to create an API key, see API Key Management).


Get All Jobs Endpoint

  1. Enter the domain name in the URL and your TimeXtender-generated API Key under the Headers tab.
  2. Click Send, review the Name, and save the Job ID.
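The Postman steps above can also be reproduced in PowerShell. This sketch makes the same assumptions as before (hypothetical base URL, endpoint path, and header name; see the TimeXtender API Endpoints article for the real values):

```powershell
# NOTE: the URI and header name are hypothetical placeholders.
$apiKey = "<your API key>"

# Fetch all jobs, then review each job's Name and save its Id
$jobs = Invoke-RestMethod `
    -Method Get `
    -Uri "https://<your-domain>/api/jobs" `
    -Headers @{ "X-Api-Key" = $apiKey }

$jobs | Select-Object Name, Id
```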

Exercise: Schedule SSL Data Refresh

  1. Right-click Jobs in the Solution Explorer and select Add Job.
  2. Enter the job name, then click Next.
  3. Select the Semantic Model instance and click Next.
  4. Select the execution package from the available list and click Next.
  5. Enter the scheduling information for the job and click Finish.


Section Quiz...

Planning to take the Solution Architect exam? Then you want to be sure these questions are easy to answer.

True or False: You can add tasks from different ODX instances to the same job in TimeXtender.

How many types of Jobs are there in TimeXtender?

True or False: External Executions allow users to run PowerShell scripts as part of Execution Packages?


When you're ready, see Answers Below

Section Quiz Answers


True or False: You can add tasks from different ODX instances to the same job in TimeXtender.


False. Only tasks relating to one ODX instance can be added to a job, meaning tasks from different ODX instances cannot be added to the same job.

How many types of Jobs are there in TimeXtender?



There are two (2) types of Jobs in TimeXtender: End-to-End Jobs (On-Demand), and ODX-only Jobs.

True or False: External Executions allow users to run PowerShell scripts as part of Execution Packages?


True! Users can run PowerShell scripts as part of Execution Packages in TimeXtender through the use of External Executions.

Want to Learn Even More?

Learn even more data loading techniques from TimeXtender Tuesdays

Congratulations! You've completed the training module

JOBS

Give Feedback
