Navigating the Azure Data Factory Studio
The Azure Data Factory Studio is the central tool for authoring ADF resources. There are several buttons on the home page that enable users to start building new workflows very quickly:
- The Ingest button, which navigates users to the Copy Data tool. This tool allows developers to quickly begin copying data from one data store to another.
- The Orchestrate button, which navigates users to the Author page where they can begin building pipelines.
- The Transform Data button, which opens a new page where developers can build a mapping data flow.
- The Configure SSIS button, which navigates users to a page where they can configure an Azure-SSIS integration runtime.
On the left side of the page is a toolbar with four buttons, including a Home button that returns users to the Azure Data Factory Studio home page. The following list describes how you can use the other three buttons in the toolbar to build and manage ADF resources:
- The Author button opens the Author page, where users can build and manage pipelines, datasets, mapping data flows, and Power Query activities. Figure 5.20 is an example of the Author page with a single-activity pipeline that copies data from Azure SQL Database to ADLS.

FIGURE 5.20 Azure Data Factory Studio Author page
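
The pipeline shown in Figure 5.20 can also be defined outside the Studio. The following is a minimal sketch using the azure-mgmt-datafactory Python SDK; the subscription, resource group, factory, dataset, and pipeline names are placeholders, and it assumes the source and sink datasets (and their linked services) already exist in the factory:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureSqlSource,
    CopyActivity,
    DatasetReference,
    ParquetSink,
    PipelineResource,
)

# Management-plane client for the data factory (placeholder subscription ID).
adf_client = DataFactoryManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

resource_group = "adf-demo-rg"        # hypothetical resource group
factory_name = "adf-demo-factory"     # hypothetical data factory

# A single copy activity that reads from an existing Azure SQL Database dataset
# and writes Parquet files to an existing ADLS dataset (hypothetical names).
copy_activity = CopyActivity(
    name="CopySqlToAdls",
    inputs=[DatasetReference(reference_name="AzureSqlCustomerTable", type="DatasetReference")],
    outputs=[DatasetReference(reference_name="AdlsCustomerParquet", type="DatasetReference")],
    source=AzureSqlSource(),
    sink=ParquetSink(),
)

# Publish the pipeline to the factory and start an on-demand run.
adf_client.pipelines.create_or_update(
    resource_group, factory_name, "CopySqlToAdlsPipeline",
    PipelineResource(activities=[copy_activity]),
)
run = adf_client.pipelines.create_run(resource_group, factory_name, "CopySqlToAdlsPipeline")
print(f"Started pipeline run {run.run_id}")
```

The Studio remains the quickest way to assemble and debug a pipeline interactively; an SDK sketch like this one is mainly useful when the same resources need to be re-created repeatedly or from automation.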
- The Monitor button opens a page that provides performance metrics for pipeline runs, trigger runs, and integration runtimes. Figure 5.21 is an example of the Monitor page.

FIGURE 5.21 Azure Data Factory Studio Monitor page
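
The run history surfaced on the Monitor page can also be queried programmatically. The snippet below is a rough sketch that retrieves the last 24 hours of pipeline runs with the azure-mgmt-datafactory Python SDK, reusing the hypothetical adf_client, resource_group, and factory_name values from the previous sketch:

```python
from datetime import datetime, timedelta, timezone

from azure.mgmt.datafactory.models import RunFilterParameters

# Query pipeline runs updated in the last 24 hours, similar to the pipeline
# runs view on the Monitor page (client and names from the previous sketch).
filter_params = RunFilterParameters(
    last_updated_after=datetime.now(timezone.utc) - timedelta(hours=24),
    last_updated_before=datetime.now(timezone.utc),
)
runs = adf_client.pipeline_runs.query_by_factory(resource_group, factory_name, filter_params)

for run in runs.value:
    print(run.pipeline_name, run.status, run.run_start, run.duration_in_ms)
```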
- The Manage button opens a page (see Figure 5.22) that allows you to perform several management tasks, such as those listed here (a scripted sketch of the first task follows the figure):
- Create or delete linked services.
- Create or delete integration runtimes.
- Link an Azure Purview account to catalog metadata and data lineage.
- Connect the ADF instance to a Git repository.
- Create or delete pipeline triggers.
- Configure a customer-managed encryption key and define access management for the ADF instance.

FIGURE 5.22 Azure Data Factory Studio Manage page
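
Several of the management tasks in this list can also be scripted. As a rough illustration, the following sketch creates the two linked services the copy scenario in this chapter relies on (Azure SQL Database and ADLS Gen2) using the azure-mgmt-datafactory Python SDK; the connection details are placeholders, and the client variables are the hypothetical ones from the earlier sketches:

```python
from azure.mgmt.datafactory.models import (
    AzureBlobFSLinkedService,
    AzureSqlDatabaseLinkedService,
    LinkedServiceResource,
)

# Linked service for the source Azure SQL Database (placeholder connection string).
sql_ls = LinkedServiceResource(
    properties=AzureSqlDatabaseLinkedService(
        connection_string=(
            "Server=tcp:<server>.database.windows.net,1433;"
            "Database=AdventureWorksLT;"
            "User ID=<user>;Password=<password>;Encrypt=true;"
        )
    )
)
adf_client.linked_services.create_or_update(
    resource_group, factory_name, "AzureSqlDatabaseLinkedService", sql_ls
)

# Linked service for the sink ADLS Gen2 account (placeholder URL and account key).
adls_ls = LinkedServiceResource(
    properties=AzureBlobFSLinkedService(
        url="https://<storage-account>.dfs.core.windows.net",
        account_key="<storage-account-key>",
    )
)
adf_client.linked_services.create_or_update(
    resource_group, factory_name, "AdlsLinkedService", adls_ls
)
```

In practice, secrets such as the SQL connection string or the storage account key are better referenced from Azure Key Vault than embedded directly in code.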
The following section, “Building an ADF Pipeline with a Copy Data Activity,” will detail how to create the activity, datasets, and linked services that are associated with the pipeline in Figure 5.20 (shown earlier). More specifically, it will demonstrate how to use the copy activity to copy data from an Azure SQL Database to an ADLS account. The source database is restored from the publicly available AdventureWorksLT2019 database backup. If you would like to build this demo on your own, you can find the database backup at https://docs.microsoft.com/en-us/sql/samples/adventureworks-install-configure?view=sql-server-ver15&tabs=ssms#download-backup-files.