Scheduling a Data Masking Job

To set up the data masking job and schedule its execution:

  1. Select the masking definition for which a script has been generated, then click Schedule Job.

  2. Change the default job name if desired and enter an optional job description.

  3. Select a database from the drop-down menu and indicate your preference:

    • In-Place Masking–to replace sensitive data in place with masked data on a specified database (usually copied from production). Use this option only in nonproduction environments. This differs from the Actions menu option Clone Database, which clones the database and then masks the data.

    • At-Source Masking–to export masked data from the specified source database (usually production) using Oracle Data Pump. This option is safe to run in a production environment because it does not modify customer data. Note, however, that it creates temporary tables that are dropped when the masking operation completes.

    Your selections affect the check box text that appears below the radio buttons as well as other regions on the page.
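    Enterprise Manager generates its own masking script, but the at-source pattern can be illustrated with Oracle Data Pump's REMAP_DATA export parameter, which applies a masking function to column values as they are written to the dump file. The schema, table, column, and function names below are hypothetical, and this sketch is not the script Enterprise Manager itself produces:

    ```
    # Hypothetical Data Pump parameter file (mask_at_source.par) sketching
    # at-source masking: values of HR.EMPLOYEES.SSN pass through the masking
    # function HR.MASK_PKG.MASK_SSN during export; the source table is never
    # modified.
    DIRECTORY=DATA_FILE_DIR
    DUMPFILE=hr_masked.dmp
    TABLES=HR.EMPLOYEES
    REMAP_DATA=HR.EMPLOYEES.SSN:HR.MASK_PKG.MASK_SSN
    ```

    A file like this would be passed to the export utility as expdp hr PARFILE=mask_at_source.par, which matches the at-source behavior described above: masked data leaves the database in the dump file while production data stays intact.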

  4. Proceed according to your selections in Step 3:

    • Data Mask Options–Provide the requisite information as follows:

      • Script File Location and Name–After script generation completes, the data masking script is stored in the Enterprise Manager repository. By default, the masking job retrieves the script from the repository and copies it to the $ORACLE_HOME/dbs directory on the database host under a generated file name. Use these fields to override the default location and file name.

      • Workloads–Select options to mask SQL tuning sets and capture files, as appropriate. Browse to the file location where you want to capture the files.

      • Detect SQL Plan Changes Due to Masking–Run the SQL Performance Analyzer to assess the impact of masking. Provide a task name and browse to the corresponding SQL tuning set.

    • Data Export Options–Provide the requisite information as follows:

      • Specify a directory in which to save the mask dump file. The drop-down list contains the directory objects that you can access; alternatively, you can select a custom directory path. Select the check box if you want to speed up the process by using an external directory. Recommended default: DATA_FILE_DIR.

      • To override the defaults, enter a name for the export file, a maximum file size in megabytes, and the maximum number of threads of active execution operating on behalf of the export job. The thread count lets you trade resource consumption against elapsed time.

      • Select whether to enable dump file compression and encryption. Enter and confirm an encryption password, if appropriate. Log file generation is selected by default.
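    The export fields in this step correspond to standard Oracle Data Pump export parameters. The following parameter file is a hedged sketch of that mapping; the directory name, file names, size, and thread count are illustrative assumptions, not values Enterprise Manager generates:

    ```
    # Hypothetical Data Pump parameter file mapping the Data Export Options
    # fields to export parameters.
    DIRECTORY=DATA_FILE_DIR          # directory object for the mask dump
    DUMPFILE=mask_export_%U.dmp      # export file name (%U numbers multiple files)
    FILESIZE=500M                    # maximum size per dump file
    PARALLEL=4                       # maximum active threads for the export job
    COMPRESSION=ALL                  # dump file compression
    ENCRYPTION=ALL                   # dump file encryption
    LOGFILE=mask_export.log          # log file generation (selected by default)
    ```

    The encryption password is best supplied interactively rather than stored in the file. Raising PARALLEL shortens elapsed time at the cost of more CPU and I/O on the source database, which is the trade-off noted above.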

  5. Specify credentials to log in to the database host.

  6. Specify credentials to log in to the reference database.

  7. Specify whether to start the job immediately or at a later date and time, then click Submit.

    A message confirms that the job has been scheduled. Refresh the page to see the job results.