Training Module:
JOBS
Jobs are used to schedule transfers and executions for all instance types, such as ODX, MDW, and SSL.
Table of Contents
Key Concepts: Jobs
What are Jobs?
Jobs are the way to schedule and automate tasks and execution packages for
Operational Data Exchange (ODX), Data Warehouse (MDW), and Semantic Model (SSL) instances.
Types of Jobs in TimeXtender
1. Jobs to Schedule ODX Tasks
For ODX instances, jobs are used to schedule tasks: transfer, synchronization, and storage management.
When to Use: to schedule ODX Tasks and update the raw data stored in your data lake or on-premises SQL Server storage.
For example, scheduling the ODX (only) is ideal for incremental loads, large table transfers that should run outside of busy hours, and storage cleanup tasks.
Learn more about Tasks in ODX.
Transfer Tasks load data from data sources into ODX storage.
2. Jobs to Schedule Execution Packages (and more)
For Data Warehouse and Semantic Model instances, jobs are used to schedule execution packages. This approach uses Data On-Demand rather than ODX Tasks.
When to Use: to schedule an end-to-end data refresh (i.e., ODX, MDW, SSL, and Semantic Endpoints).
For example, using jobs to run Execution Packages is perfect for “refreshing” a dashboard with the latest data, especially with the “Data On-Demand” feature.
Learn more about Execution Packages and Data On-Demand.
Please check out Scheduling Executions using Jobs to learn more.
Create a New Execution Package
An execution package determines which objects in an instance will be executed and how.
Create an Execution Package (MDW)
Edit an Existing Execution Package (MDW)
For more details about the Execution Package configuration options, please check out Configuring Execution Packages.
Using Jobs
There are essentially two (2) types of Jobs in TimeXtender, and they are mutually exclusive: they cannot be combined, nor do they need to be, as they serve different purposes.
For End-to-End Scheduling Automation: Configure a Job with MDW and SSL Instances
For ODX Data Refresh (only): Configure a Job with ODX Instances
WARNING: Only tasks relating to one ODX instance can be added to a job. In other words, tasks from different ODX instances cannot be added to the same job.
For more information on tasks, see Tasks in an ODX instance.
Manage Jobs
Optional
Edit (or Delete) a Job
Edit Job Schedule
Monitor Jobs
Enable On-Demand
Optional
What is “Data on Demand”?
Data On-Demand is an advanced setting on a data source. The idea is to simplify data transfers and updates using a single job, without creating or using an ODX Transfer Task*. This means that you only create the end-to-end data refresh, ensure that the data source’s ‘Data on Demand’ feature is enabled, and schedule the Job.
When to use: to schedule an end-to-end data refresh (and “bypass” the ODX tasks and logs).
*Please note: when data is loaded using “Data on Demand”, the ODX will NOT create log entries, due to the way the data is queried: “bypassing” the ODX tasks and logs allows for minimal setup and configuration. If you prefer to have ODX execution logs for each data refresh, you should NOT use the ‘Data on Demand’ feature, but rather create two (2) jobs: an ODX Job plus an End-to-End Job. By using these two (2) distinct jobs together, the ODX execution logs will be created.
Learn more about Data On-Demand.
Enable “Data on Demand” in Desktop
WARNING: Data on Demand does NOT support certain features and scenarios. For the full list, see: Data On-Demand.
Execution Queue
ODX Execution Queue
The execution queue can also be viewed by right-clicking an ODX instance and selecting View Execution Queue.
Object Dependencies
Object dependencies are defined by selecting the tables that should be executed prior to the selected table. This may help avoid deadlocks.
PowerShell + Jobs
What are External Executions?
An "external execution" is a separate program or script designed to carry out specific tasks related to data, which “extends” the native functionality of a program or software offering. By using External Executions, a developer can leverage the capabilities of Jobs + PowerShell to manage infrastructure and storage, advanced data analysis, or more complex operations.
For example, data engineers may use these external executables to augment and automate data pre-processing or post-processing steps by running a PowerShell script as part of a Job in TimeXtender, as sketched below.
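To make this concrete, here is a minimal, hypothetical sketch of such a post-processing step; the folder paths and file pattern are illustrative placeholders, not TimeXtender defaults:
Example PowerShell Script: Archive Processed Extract Files (hypothetical)
# Move extract files from before today into an archive folder
# after the Job completes. Both paths are placeholders.
$source = "C:\Data\Extracts"
$archive = "C:\Data\Archive"
Get-ChildItem -Path $source -Filter "*.csv" |
    Where-Object { $_.LastWriteTime -lt (Get-Date).Date } |
    Move-Item -Destination $archive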
External Executions (PowerShell)
Create & execute an External Executable
Edit External Executable
Add External Executable to Execution Package
Pro-Tip: Best practice is to NOT change the behavior of the Default Execution Package, as it is used for manual executions (i.e., during development tasks), and changing it may have unintended or unexpected consequences for your solution. When in doubt, create a new execution package, or ask your Partner or SSP.
Read more about creating Execution Packages.
Example PowerShell Scripts
Optional
To see the complete list of example PowerShell scripts, please check out the Knowledge Base article on the topic, Execute PowerShell Script as External Executable.
Here are three (3) examples from that article to give you a preview.
Process External DB
Example PowerShell Script: Process Analysis Services Database
Invoke-ProcessASDatabase `
-Server "localhost" `
-DatabaseName "SalesOnPrem" `
-RefreshType "Full"
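Note: Invoke-ProcessASDatabase ships with the SqlServer PowerShell module, so that module must be installed on the machine that runs the script.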
Increase Processing Power
Example PowerShell Script: Scale Up Azure SQL DB
Set-AzSqlDatabase `
-ResourceGroupName <ResourceGroupName> `
-ServerName <ServerName> `
-DatabaseName <DB Name> `
-Edition "Premium" `
-RequestedServiceObjectiveName "P1"
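Note: Set-AzSqlDatabase comes from the Az.Sql module and requires an authenticated Azure context (e.g., via Connect-AzAccount or a service principal) before the script runs.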
Decrease Cost
Example PowerShell Script: Scale Down Azure SQL DB
Set-AzSqlDatabase `
-ResourceGroupName <ResourceGroupName> `
-ServerName <ServerName> `
-DatabaseName <DB Name> `
-Edition "Basic" `
Jobs API Endpoints
What are Jobs API Endpoints?
The purpose of these API endpoints is to provide a programmatic way to interact with and manage data jobs in TimeXtender. The Jobs API endpoints simplify programmatic access and automation for data estate management and orchestration.
For a complete list of the API Endpoints and parameters, please see the Knowledge Base article, TimeXtender API Endpoints.
For example, data engineers may use these API Endpoints to augment and automate Jobs-related work. Here are some examples.
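As a sketch only, here is what calling a “get all jobs” endpoint from PowerShell might look like; the base URL, route, and authentication header are assumptions, so consult the TimeXtender API Endpoints article for the documented values:
Example PowerShell Script: Get All Jobs via the API (hypothetical values)
# Placeholder URL and API key; replace with the documented endpoint
# and authentication scheme from the Knowledge Base article.
$headers = @{ "Authorization" = "Bearer <API Key>" }
$jobs = Invoke-RestMethod `
-Uri "https://api.timextender.com/jobs" `
-Headers $headers `
-Method Get
$jobs | Format-Table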
API Endpoints & Postman collection
Optional
Import Jobs collection
Prerequisites
Get all jobs endpoint
Exercise: Schedule SSL Data Refresh
Section Quiz...
Planning to take the Solution Architect exam? Then you want to be sure these questions are easy to answer.
True or False: You can add tasks from different ODX instances to the same job in TimeXtender.
How many types of Jobs are there in TimeXtender?
True or False: External Executions allow users to run PowerShell scripts as part of Execution Packages.
When you're ready, see Answers Below
Section Quiz Answers
Planning to take the Solution Architect exam? Then you want to be sure these questions are easy to answer.
True or False: You can add tasks from different ODX instances to the same job in TimeXtender.
False. Only tasks relating to one ODX instance can be added to a job, meaning tasks from different ODX instances cannot be added to the same job.
How many types of Jobs are there in TimeXtender?
There are two (2) types of Jobs in TimeXtender: End-to-End Jobs (On-Demand) and ODX-only Jobs.
True or False: External Executions allow users to run PowerShell scripts as part of Execution Packages.
True! Users can run PowerShell scripts as part of Execution Packages in TimeXtender through the use of External Executions.
Want to Learn Even More?
Learn even more data loading techniques from TimeXtender Tuesdays
Congratulations! You've completed the training module
JOBS