
Before you begin

Before using the Bulk load into Azure SQL DB job entry in PDI, you must have the following items:

  • An Azure account.
  • A connection to the Azure SQL database in which to load your data.
  • A VFS connection to Azure Data Lake Storage.
  • A table and schema set up in the database where you want to place your data. The table must define all the columns you need, so create it before running the entry for the first time (a minimal example follows the note below).
Note: The INSERT and ADMINISTER DATABASE BULK OPERATIONS permissions are required in the Azure SQL database to use the Bulk load into Azure SQL DB job entry.
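
The following T-SQL sketch illustrates the last two prerequisites: creating the destination table and granting the required permissions. The table name, columns, and database user shown here are placeholders, not part of the product; substitute the objects and user from your own Azure SQL database.

  -- Create the destination table with every column the bulk load needs.
  -- dbo.SalesStaging and its columns are hypothetical examples.
  CREATE TABLE dbo.SalesStaging (
      OrderId    INT            NOT NULL,
      CustomerId INT            NOT NULL,
      OrderDate  DATE           NOT NULL,
      Amount     DECIMAL(18, 2) NOT NULL
  );

  -- Grant the permissions the job entry requires to the database user
  -- that PDI connects as (pdi_loader is a placeholder).
  GRANT INSERT, ADMINISTER DATABASE BULK OPERATIONS TO [pdi_loader];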