Design Patterns for Database API Testing 4: REF Cursor Getter

Last October I gave a presentation on database unit testing with utPLSQL, Oracle Unit Testing with utPLSQL. I mentioned design patterns as a way of reducing the effort of building unit tests and outlined some strategies for coding them effectively.

In the current set of articles, I develop the ideas further, starting from the idea that all database APIs can be considered in terms of the axes:

  • direction (i.e. getter or setter, noting that setters can also 'get')
  • mode (i.e. real time or batch)

For each cell in the implied matrix, I construct an example API (or view) with specified requirements against Oracle's HR demo schema, and use this example to construct a testing program with appropriate scenarios as a design pattern. Concepts and common patterns and anti-patterns in automated API testing are discussed throughout, and these are largely independent of the testing framework used. However, the examples use my own lightweight independent framework, which is designed to help avoid many API testing anti-patterns. The code is available on GitHub at BrenPatF/trapit_oracle_tester, and includes both the framework and the design pattern examples.

Behind the four examples, there is an underlying design pattern that involves wrapping the API call in a 'pure' procedure, called once per scenario, with the output 'actuals' array including everything affected by the API, whether as output parameters, or on database tables, etc. The inputs are also extended from the API parameters to include any other effective inputs. Assertion takes place after all scenarios and is against the extended outputs, with extended inputs also listed. This concept of the 'pure' function, central to Functional Programming, has important advantages in automated testing. I explained the concepts involved in a presentation at the Oracle User Group Ireland Conference in March 2018:

The Database API Viewed As A Mathematical Function: Insights into Testing

The previous example is here: Design Patterns for Database API Testing 3: Batch Loading of Flat Files. This fourth example covers getting data in real time via a REF cursor, such as would often be used as the source for a web service.

Requirement Summary

Modern Oracle SQL is very powerful and can apply complex logic within a single statement, reducing the need for more complex procedural code. In order to show how to test SQL that might be used in a batch getter module, we previously devised a test view, HR_Test_V, having a range of features that we might want to test in general. We can use similar SQL to demonstrate API testing of a real time getter procedure that might be used by a web service, where reference cursors are often used as output parameters. The following list of features to test is taken, in slightly modified form, from Design Patterns for Database API Testing 2: Views 1 - Design.

  • Inner joins suppress driving records where there is no joining record
  • Outer joins return driving records where there is no joining record
  • Analytic functions that partition by some key, and return aggregates on the returned record set
  • Functions based on aggregates over records that include those not in the returned record set
  • Constraints based on aggregates over records that include those not in the returned record set
  • Constraints on column values

The SQL functionality can be described in words as:

  • Selected values
    • Employee name, department name, and salary
    • Manager's name
    • Ratio of employee's salary to the department average (returned employees only)
    • Ratio of employee's salary to the average salary of all employees
  • Constraints
    • Exclude employees in job 'AD_ASST'
    • Return employees for a department passed as a bind parameter
    • Do not return any records if the total salary of all employees is below 1600
  • Outer join
    • Include employees both with and without a manager

The REF cursor SQL is:

WITH all_emps AS (
        SELECT Avg (salary) avg_sal, SUM (salary) sal_tot_g
          FROM employees e
)
SELECT e.last_name, d.department_name, m.last_name manager, e.salary,
       Round (e.salary / Avg (e.salary) OVER (PARTITION BY e.department_id), 2) sal_rat,
       Round (e.salary / a.avg_sal, 2) sal_rat_g
  FROM all_emps a
 CROSS JOIN employees e
  JOIN departments d
    ON d.department_id = e.department_id
  LEFT JOIN employees m
    ON m.employee_id = e.manager_id
 WHERE e.job_id != 'AD_ASST'
   AND a.sal_tot_g >= 1600
   AND d.department_id = :1

Notes on API Testing REF Cursor Procedures

  • A new utility function, Cursor_to_Array, has been added to Utils_TT; it converts an open reference cursor to a list of delimited strings (created from an initial stand-alone procedure: A Utility for Reading REF Cursors into a List of Delimited Strings). A sketch of this kind of conversion follows below
  • Using this utility, very little code needs to be written once the test data has been set up: one call to return the reference cursor, and a second to return the actual values in a list, which is passed at the end in a single call to the library results checker
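
For reference, a ref cursor to delimited-strings conversion of this kind can be written with DBMS_SQL, along the lines of the sketch below. This is an illustration only, with assumed names and a simple '|' delimiter; the Utils_TT function linked above is the authoritative implementation.

CREATE OR REPLACE FUNCTION Cursor_To_Delim_List (x_csr   IN OUT SYS_REFCURSOR,
                                                 p_delim IN     VARCHAR2 DEFAULT '|')
  RETURN SYS.ODCIVarchar2List IS
  l_csr_id   PLS_INTEGER;
  l_n_cols   PLS_INTEGER;
  l_desc_lis DBMS_SQL.DESC_TAB;
  l_value    VARCHAR2(4000);
  l_line     VARCHAR2(4000);
  l_ret_lis  SYS.ODCIVarchar2List := SYS.ODCIVarchar2List();
BEGIN

  -- Convert the open ref cursor to a DBMS_SQL cursor and describe its columns
  l_csr_id := DBMS_SQL.To_Cursor_Number (x_csr);
  DBMS_SQL.Describe_Columns (l_csr_id, l_n_cols, l_desc_lis);

  -- Define every column as a string so any query can be fetched generically
  FOR i IN 1..l_n_cols LOOP
    DBMS_SQL.Define_Column (l_csr_id, i, l_value, 4000);
  END LOOP;

  -- Fetch each row, concatenating its column values into one delimited string
  WHILE DBMS_SQL.Fetch_Rows (l_csr_id) > 0 LOOP
    l_line := NULL;
    FOR i IN 1..l_n_cols LOOP
      DBMS_SQL.Column_Value (l_csr_id, i, l_value);
      l_line := l_line || CASE WHEN i > 1 THEN p_delim END || l_value;
    END LOOP;
    l_ret_lis.EXTEND;
    l_ret_lis (l_ret_lis.COUNT) := l_line;
  END LOOP;
  DBMS_SQL.Close_Cursor (l_csr_id);

  RETURN l_ret_lis;

END Cursor_To_Delim_List;
/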

ERD

[Image: unit-testing-three-erd_rc (entity-relationship diagram for the test)]

Design Pattern Groups

The API testing framework is centred around the concept of input and output groups, representing the data sets that respectively form the inputs to, and outputs from, the program. The records in each group are printed by the framework with column headers for each scenario. These groups are identified by the developer, and in this case they are as noted below.

Input Groups

  • Employees Table
  • Department Parameter

Output Groups

  • Select results
  • Timing of average call

Test Scenarios

The scenario descriptions start with a data set code, followed by a verbal description.

  1. DS-1, testing inner, outer joins, analytic over dep, and global ratios with 1 dep (10) - pass dep 10
  2. DS-2, testing same as 1 but with extra emp in another dep (20) - pass dep 10
  3. DS-2, as second scenario, but - pass dep 20
  4. DS-2, as second scenario, but - pass null dep
  5. DS-3, Salaries total 1500 (< threshold of 1600, so return nothing) - pass dep 10

Package Structure Diagram

Call Structure Table - TT_Emp_WS.tt_AIP_Get_Dept_Emps

TT_Emp_WS.tt_AIP_Get_Dept_Emps - Emp REF Cursor Test Procedure

Declare section

PROCEDURE tt_AIP_Get_Dept_Emps IS

  c_proc_name           CONSTANT VARCHAR2(61) := 'TT_Emp_WS.tt_AIP_Get_Dept_Emps';
  c_dep_id_1            CONSTANT PLS_INTEGER := 10;
  c_dep_id_2            CONSTANT PLS_INTEGER := 20;
  c_dep_nm_1            CONSTANT VARCHAR2(100) := 'Administration';
  c_dep_nm_2            CONSTANT VARCHAR2(100) := 'Marketing';
  c_job_bad             CONSTANT VARCHAR2(100) := 'AD_ASST';
  c_job_good            CONSTANT VARCHAR2(100) := 'IT_PROG';
  c_base_sal            CONSTANT PLS_INTEGER := 1000;
  c_out_group_lis       CONSTANT L1_chr_arr := L1_chr_arr ('Select results');

  c_ln_pre              CONSTANT VARCHAR2(10) := DML_API_TT_HR.c_ln_pre;

  c_dep_lis             CONSTANT L1_chr_arr := L1_chr_arr (c_dep_id_1, c_dep_id_1, c_dep_id_2, NULL, c_dep_id_1);

  c_dataset_3lis        CONSTANT L3_chr_arr := L3_chr_arr (
                             L2_chr_arr (L1_chr_arr ('4 emps, 1 dep (10), emp-3 has no dep, emp-4 has bad job'),
--                                         dep           job          salary
                               L1_chr_arr (c_dep_id_1,   c_job_good,  '1000'),
                               L1_chr_arr (c_dep_id_1,   c_job_good,  '2000'),
                               L1_chr_arr (NULL,         c_job_good,  '3000'),
                               L1_chr_arr (c_dep_id_1,   c_job_bad,   '4000')
                                             ),
                             L2_chr_arr (L1_chr_arr ('As dataset 1 but with extra emp-5, in second dep (20)'),
                               L1_chr_arr (c_dep_id_1,   c_job_good,  '1000'),
                               L1_chr_arr (c_dep_id_1,   c_job_good,  '2000'),
                               L1_chr_arr (NULL,         c_job_good,  '3000'),
                               L1_chr_arr (c_dep_id_1,   c_job_bad,   '4000'),
                               L1_chr_arr (c_dep_id_2,   c_job_good,  '5000')
                                             ),
                             L2_chr_arr (L1_chr_arr ('As dataset 2 but with salaries * 0.1, total below reporting threshold of 1600'),
                               L1_chr_arr (c_dep_id_1,   c_job_good,  '100'),
                               L1_chr_arr (c_dep_id_1,   c_job_good,  '200'),
                               L1_chr_arr (NULL,         c_job_good,  '300'),
                               L1_chr_arr (c_dep_id_1,   c_job_bad,   '400'),
                               L1_chr_arr (c_dep_id_2,   c_job_good,  '500')
                                             )
                        );

  c_exp_2lis            CONSTANT L2_chr_arr := L2_chr_arr (
                                               L1_chr_arr (
                                       Utils.List_Delim (c_ln_pre || '1',   c_dep_nm_1, NULL,            '1000', '.67',   '.4'),
                                       Utils.List_Delim (c_ln_pre || '2',   c_dep_nm_1, c_ln_pre || '1', '2000',  '1.33', '.8')
                                               ),
                                               L1_chr_arr (
                                       Utils.List_Delim (c_ln_pre || '1',   c_dep_nm_1, NULL,            '1000', '.67',  '.33'),
                                       Utils.List_Delim (c_ln_pre || '2',   c_dep_nm_1, c_ln_pre || '1', '2000',  '1.33', '.67')
                                               ),
                                               L1_chr_arr (
                                       Utils.List_Delim (c_ln_pre || '5',   c_dep_nm_2, c_ln_pre || '1', '5000',  '1',    '1.67')
                                               ),
                                               Utils_TT.c_empty_list,
                                               Utils_TT.c_empty_list
                        );

  c_scenario_ds_lis     CONSTANT L1_num_arr := L1_num_arr (1, 2, 2, 2, 3);
  c_scenario_lis        CONSTANT L1_chr_arr := L1_chr_arr (
                               'DS-1, testing inner, outer joins, analytic over dep, and global ratios with 1 dep (10) - pass dep 10',
                               'DS-2, testing same as 1 but with extra emp in another dep (20) - pass dep 10',
                               'DS-2, as second scenario, but - pass dep 20',
                               'DS-2, as second scenario, but - pass null dep',
                               'DS-3, Salaries total 1500 (< threshold of 1600, so return nothing) - pass dep 10');

  c_inp_group_lis       CONSTANT L1_chr_arr := L1_chr_arr ('Employee', 'Department Parameter');
  c_inp_field_2lis      CONSTANT L2_chr_arr := L2_chr_arr (
                                                        L1_chr_arr (
                                                                '*Employee Id',
                                                                'Last Name',
                                                                'Email',
                                                                'Hire Date',
                                                                'Job',
                                                                '*Salary',
                                                                '*Manager Id',
                                                                '*Department Id',
                                                                'Updated'),
                                                        L1_chr_arr (
                                                                '*Department Id')
  );
  c_out_field_2lis      CONSTANT L2_chr_arr :=  L2_chr_arr ( L1_chr_arr (
                                'Name',
                                'Department',
                                'Manager',
                                '*Salary',
                                '*Salary Ratio (dep)',
                                '*Salary Ratio (overall)'));

  l_act_2lis                      L2_chr_arr := L2_chr_arr();
  c_ms_limit            CONSTANT PLS_INTEGER := 1;
  l_timer_set                    PLS_INTEGER;
  l_inp_3lis                     L3_chr_arr := L3_chr_arr();
  l_emp_csr                      SYS_REFCURSOR;

Notes on declare section

  • Data sets, scenarios, expected values etc. are stored in generic arrays (sketched below), where:
    • L1_chr_arr is an array type of VARCHAR2(4000), the same as the standard type SYS.ODCIVarchar2List
    • L2_chr_arr is an array type of L1_chr_arr
    • L3_chr_arr is an array type of L2_chr_arr
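
For reference, the generic collection types can be created at schema level along the lines below. This is a sketch based on the descriptions above (L1_num_arr, used for numeric lists such as c_scenario_ds_lis, is included for completeness); the GitHub project contains the actual definitions.

CREATE OR REPLACE TYPE L1_chr_arr IS TABLE OF VARCHAR2(4000);
/
CREATE OR REPLACE TYPE L2_chr_arr IS TABLE OF L1_chr_arr;
/
CREATE OR REPLACE TYPE L3_chr_arr IS TABLE OF L2_chr_arr;
/
CREATE OR REPLACE TYPE L1_num_arr IS TABLE OF NUMBER;
/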

Setup section

  PROCEDURE Setup_Array IS
  BEGIN

    l_act_2lis.EXTEND (c_exp_2lis.COUNT);
    l_inp_3lis.EXTEND (c_exp_2lis.COUNT);

    FOR i IN 1..c_exp_2lis.COUNT LOOP

      l_inp_3lis (i) := L2_chr_arr();
      l_inp_3lis (i).EXTEND(2);
      l_inp_3lis (i)(2) := L1_chr_arr (c_dep_lis(i));

    END LOOP;

  END Setup_Array;

  /***************************************************************************************************

  Setup_DB: Create test records for a given scenario for testing AIP_Get_Dept_Emps

  ***************************************************************************************************/
  PROCEDURE Setup_DB (p_call_ind           PLS_INTEGER,   -- index of input dataset
                      x_inp_lis        OUT L1_chr_arr) IS -- input list, employees

    l_emp_id            PLS_INTEGER;
    l_mgr_id            PLS_INTEGER;
    l_len_lis           L1_num_arr := L1_num_arr (1, -11, -13, -10, 10, -10);

  BEGIN

    Utils.Heading ('Employees created in setup: DS-' || p_call_ind || ' - ' || c_dataset_3lis (p_call_ind)(1)(1));
    Utils.Col_Headers (L1_chr_arr ('#', 'Employee id', 'Department id', 'Manager', 'Job id', 'Salary'), l_len_lis);
    x_inp_lis := L1_chr_arr();
    x_inp_lis.EXTEND (c_dataset_3lis (p_call_ind).COUNT - 1);
    FOR i IN 2..c_dataset_3lis (p_call_ind).COUNT LOOP

      l_emp_id := DML_API_TT_HR.Ins_Emp (
                            p_emp_ind  => i - 1,
                            p_dep_id   => c_dataset_3lis (p_call_ind)(i)(1),
                            p_mgr_id   => l_mgr_id,
                            p_job_id   => c_dataset_3lis (p_call_ind)(i)(2),
                            p_salary   => c_dataset_3lis (p_call_ind)(i)(3),
                            x_rec      => x_inp_lis(i - 1));
      Utils.Pr_List_As_Line (L1_chr_arr ((i-1), l_emp_id, Nvl (c_dataset_3lis (p_call_ind)(i)(1), ' '), Nvl (To_Char(l_mgr_id), ' '), c_dataset_3lis (p_call_ind)(i)(2), c_dataset_3lis (p_call_ind)(i)(3)), l_len_lis);
      IF i = 2 THEN
        l_mgr_id := l_emp_id;
      END IF;

    END LOOP;

  END Setup_DB;

Notes on setup section

  • c_dataset_3lis contains the data for all data sets indexed by (data set, record, field)
  • Setup_Array is called once to do some setup on the arrays
  • Setup_DB is called for a single data set at a time in each scenario
  • The description of the data set is contained in the array and printed out
  • The data set is printed out in tabular format; in the current version of the utility code this is not strictly necessary, because all the input data is printed out before the outputs

'Pure' API wrapper procedure

  PROCEDURE Purely_Wrap_API (p_scenario_ds      PLS_INTEGER,   -- index of input dataset
                             p_dep_id           PLS_INTEGER,   -- input department id
                             x_inp_lis      OUT L1_chr_arr,    -- generic inputs list (for scenario)
                             x_act_lis      OUT L1_chr_arr) IS -- generic actual values list (for scenario)
  BEGIN

    Setup_DB (p_scenario_ds, x_inp_lis);
    Timer_Set.Increment_Time (l_timer_set, Utils_TT.c_setup_timer);

    Emp_WS.AIP_Get_Dept_Emps (p_dep_id  => p_dep_id,
                              x_emp_csr => l_emp_csr);
    x_act_lis := Utils_TT.List_or_Empty (Utils_TT.Cursor_to_Array (x_csr => l_emp_csr));
    Timer_Set.Increment_Time (l_timer_set, Utils_TT.c_call_timer);
    ROLLBACK;

  END Purely_Wrap_API;

Notes on 'Pure' API wrapper procedure

  • Setup_DB is called to create the data set for the scenario
  • Emp_WS.AIP_Get_Dept_Emps returns the ref cursor (a sketch of such a procedure follows below)
  • Utils_TT.Cursor_to_Array reads the ref cursor result set into an array (which may be empty)
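
The base procedure under test, Emp_WS.AIP_Get_Dept_Emps, is not listed in this article. In outline, the packaged procedure simply opens the ref cursor for the SQL given in the Requirement Summary, with the bind variable replaced by the department id parameter, along the lines of the sketch below (an illustration, not the project code):

PROCEDURE AIP_Get_Dept_Emps (p_dep_id  IN  PLS_INTEGER,      -- department id filter
                             x_emp_csr OUT SYS_REFCURSOR) IS -- employees ref cursor
BEGIN

  OPEN x_emp_csr FOR
  WITH all_emps AS (
        SELECT Avg (salary) avg_sal, SUM (salary) sal_tot_g
          FROM employees e
  )
  SELECT e.last_name, d.department_name, m.last_name manager, e.salary,
         Round (e.salary / Avg (e.salary) OVER (PARTITION BY e.department_id), 2) sal_rat,
         Round (e.salary / a.avg_sal, 2) sal_rat_g
    FROM all_emps a
   CROSS JOIN employees e
    JOIN departments d
      ON d.department_id = e.department_id
    LEFT JOIN employees m
      ON m.employee_id = e.manager_id
   WHERE e.job_id != 'AD_ASST'
     AND a.sal_tot_g >= 1600
     AND d.department_id = p_dep_id;

END AIP_Get_Dept_Emps;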

Main section

BEGIN
--
-- Every testing main section should be similar to this, with array setup, then loop over scenarios
-- making a 'pure'(-ish) call to specific, local Purely_Wrap_API, with single assertion call outside
-- the loop
--
  l_timer_set := Utils_TT.Init (c_proc_name);
  Setup_Array;

  FOR i IN 1..c_exp_2lis.COUNT LOOP

    Purely_Wrap_API (c_scenario_ds_lis(i), c_dep_lis(i), l_inp_3lis(i)(1), l_act_2lis(i));

  END LOOP;

  Utils_TT.Is_Deeply (c_proc_name, c_scenario_lis, l_inp_3lis, l_act_2lis, c_exp_2lis, l_timer_set, c_ms_limit,
                      c_inp_group_lis, c_inp_field_2lis, c_out_group_lis, c_out_field_2lis);

EXCEPTION

  WHEN OTHERS THEN
    Utils.Write_Other_Error;
    RAISE;

END tt_AIP_Get_Dept_Emps;

Notes on main section

  • It's quite short, isn't it? šŸ™‚
  • Setup is called to do array setup
  • Main section loops over the scenarios calling Purely_Wrap_API
  • Is_Deeply is called to do all the assertions within nested loops, then print the results
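
Note that the test procedure is not called directly; as shown in the results section of the Views article later in this document, the whole test suite is run via the framework driver, along these lines:

BEGIN

  Utils.Clear_Log;
  Utils_TT.Run_Suite (Utils_TT.c_tt_suite_bren);

EXCEPTION
  WHEN OTHERS THEN
    Utils.Write_Other_Error;
END;
/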

Test Output

TRAPIT TEST: TT_Emp_WS.tt_AIP_Get_Dept_Emps
===========================================

Employees created in setup: DS-1 - 4 emps, 1 dep (10), emp-3 has no dep, emp-4 has bad job
==========================================================================================

#  Employee id  Department id     Manager  Job id          Salary
-  -----------  -------------  ----------  ----------  ----------
1         1658             10              IT_PROG           1000
2         1659             10        1658  IT_PROG           2000
3         1660                       1658  IT_PROG           3000
4         1661             10        1658  AD_ASST           4000

Employees created in setup: DS-2 - As dataset 1 but with extra emp-5, in second dep (20)
========================================================================================

#  Employee id  Department id     Manager  Job id          Salary
-  -----------  -------------  ----------  ----------  ----------
1         1662             10              IT_PROG           1000
2         1663             10        1662  IT_PROG           2000
3         1664                       1662  IT_PROG           3000
4         1665             10        1662  AD_ASST           4000
5         1666             20        1662  IT_PROG           5000

Employees created in setup: DS-2 - As dataset 1 but with extra emp-5, in second dep (20)
========================================================================================

#  Employee id  Department id     Manager  Job id          Salary
-  -----------  -------------  ----------  ----------  ----------
1         1667             10              IT_PROG           1000
2         1668             10        1667  IT_PROG           2000
3         1669                       1667  IT_PROG           3000
4         1670             10        1667  AD_ASST           4000
5         1671             20        1667  IT_PROG           5000

Employees created in setup: DS-2 - As dataset 1 but with extra emp-5, in second dep (20)
========================================================================================

#  Employee id  Department id     Manager  Job id          Salary
-  -----------  -------------  ----------  ----------  ----------
1         1672             10              IT_PROG           1000
2         1673             10        1672  IT_PROG           2000
3         1674                       1672  IT_PROG           3000
4         1675             10        1672  AD_ASST           4000
5         1676             20        1672  IT_PROG           5000

Employees created in setup: DS-3 - As dataset 2 but with salaries * 0.1, total below reporting threshold of 1600
================================================================================================================

#  Employee id  Department id     Manager  Job id          Salary
-  -----------  -------------  ----------  ----------  ----------
1         1677             10              IT_PROG            100
2         1678             10        1677  IT_PROG            200
3         1679                       1677  IT_PROG            300
4         1680             10        1677  AD_ASST            400
5         1681             20        1677  IT_PROG            500

SCENARIO 1: DS-1, testing inner, outer joins, analytic over dep, and global ratios with 1 dep (10) - pass dep 10 {
==================================================================================================================

    INPUTS
    ======

        GROUP Employee {
        ================

            Employee Id  Last Name  Email  Hire Date    Job      Salary  Manager Id  Department Id  Updated
            -----------  ---------  -----  -----------  -------  ------  ----------  -------------  -----------
                   1658  LN_1       EM_1   01-OCT-2016  IT_PROG    1000                         10  01-OCT-2016
                   1659  LN_2       EM_2   01-OCT-2016  IT_PROG    2000        1658             10  01-OCT-2016
                   1660  LN_3       EM_3   01-OCT-2016  IT_PROG    3000        1658                 01-OCT-2016
                   1661  LN_4       EM_4   01-OCT-2016  AD_ASST    4000        1658             10  01-OCT-2016

        }
        =

        GROUP Department Parameter {
        ============================

            Department Id
            -------------
                       10

        }
        =

    OUTPUTS
    =======

        GROUP Select results: Actual = 2, Expected = 2 {
        ================================================

            F?  Name  Department      Manager  Salary  Salary Ratio (dep)  Salary Ratio (overall)
            --  ----  --------------  -------  ------  ------------------  ----------------------
                LN_1  Administration             1000                 .67                      .4
                LN_2  Administration  LN_1       2000                1.33                      .8

        } 0 failed, of 2: SUCCESS
        =========================

} 0 failed, of 2: SUCCESS
=========================

SCENARIO 2: DS-2, testing same as 1 but with extra emp in another dep (20) - pass dep 10 {
==========================================================================================

    INPUTS
    ======

        GROUP Employee {
        ================

            Employee Id  Last Name  Email  Hire Date    Job      Salary  Manager Id  Department Id  Updated
            -----------  ---------  -----  -----------  -------  ------  ----------  -------------  -----------
                   1662  LN_1       EM_1   01-OCT-2016  IT_PROG    1000                         10  01-OCT-2016
                   1663  LN_2       EM_2   01-OCT-2016  IT_PROG    2000        1662             10  01-OCT-2016
                   1664  LN_3       EM_3   01-OCT-2016  IT_PROG    3000        1662                 01-OCT-2016
                   1665  LN_4       EM_4   01-OCT-2016  AD_ASST    4000        1662             10  01-OCT-2016
                   1666  LN_5       EM_5   01-OCT-2016  IT_PROG    5000        1662             20  01-OCT-2016

        }
        =

        GROUP Department Parameter {
        ============================

            Department Id
            -------------
                       10

        }
        =

    OUTPUTS
    =======

        GROUP Select results: Actual = 2, Expected = 2 {
        ================================================

            F?  Name  Department      Manager  Salary  Salary Ratio (dep)  Salary Ratio (overall)
            --  ----  --------------  -------  ------  ------------------  ----------------------
                LN_1  Administration             1000                 .67                     .33
                LN_2  Administration  LN_1       2000                1.33                     .67

        } 0 failed, of 2: SUCCESS
        =========================

} 0 failed, of 2: SUCCESS
=========================

SCENARIO 3: DS-2, as second scenario, but - pass dep 20 {
=========================================================

    INPUTS
    ======

        GROUP Employee {
        ================

            Employee Id  Last Name  Email  Hire Date    Job      Salary  Manager Id  Department Id  Updated
            -----------  ---------  -----  -----------  -------  ------  ----------  -------------  -----------
                   1667  LN_1       EM_1   01-OCT-2016  IT_PROG    1000                         10  01-OCT-2016
                   1668  LN_2       EM_2   01-OCT-2016  IT_PROG    2000        1667             10  01-OCT-2016
                   1669  LN_3       EM_3   01-OCT-2016  IT_PROG    3000        1667                 01-OCT-2016
                   1670  LN_4       EM_4   01-OCT-2016  AD_ASST    4000        1667             10  01-OCT-2016
                   1671  LN_5       EM_5   01-OCT-2016  IT_PROG    5000        1667             20  01-OCT-2016

        }
        =

        GROUP Department Parameter {
        ============================

            Department Id
            -------------
                       20

        }
        =

    OUTPUTS
    =======

        GROUP Select results: Actual = 1, Expected = 1 {
        ================================================

            F?  Name  Department  Manager  Salary  Salary Ratio (dep)  Salary Ratio (overall)
            --  ----  ----------  -------  ------  ------------------  ----------------------
                LN_5  Marketing   LN_1       5000                   1                    1.67

        } 0 failed, of 1: SUCCESS
        =========================

} 0 failed, of 1: SUCCESS
=========================

SCENARIO 4: DS-2, as second scenario, but - pass null dep {
===========================================================

    INPUTS
    ======

        GROUP Employee {
        ================

            Employee Id  Last Name  Email  Hire Date    Job      Salary  Manager Id  Department Id  Updated
            -----------  ---------  -----  -----------  -------  ------  ----------  -------------  -----------
                   1672  LN_1       EM_1   01-OCT-2016  IT_PROG    1000                         10  01-OCT-2016
                   1673  LN_2       EM_2   01-OCT-2016  IT_PROG    2000        1672             10  01-OCT-2016
                   1674  LN_3       EM_3   01-OCT-2016  IT_PROG    3000        1672                 01-OCT-2016
                   1675  LN_4       EM_4   01-OCT-2016  AD_ASST    4000        1672             10  01-OCT-2016
                   1676  LN_5       EM_5   01-OCT-2016  IT_PROG    5000        1672             20  01-OCT-2016

        }
        =

        GROUP Department Parameter {
        ============================

            Department Id
            -------------


        }
        =

    OUTPUTS
    =======

        GROUP Select results: Actual = 0, Expected = 0: SUCCESS
        =======================================================

} 0 failed, of 1: SUCCESS
=========================

SCENARIO 5: DS-3, Salaries total 1500 (< threshold of 1600, so return nothing) - pass dep 10 {
==============================================================================================

    INPUTS
    ======

        GROUP Employee {
        ================

            Employee Id  Last Name  Email  Hire Date    Job      Salary  Manager Id  Department Id  Updated
            -----------  ---------  -----  -----------  -------  ------  ----------  -------------  -----------
                   1677  LN_1       EM_1   01-OCT-2016  IT_PROG     100                         10  01-OCT-2016
                   1678  LN_2       EM_2   01-OCT-2016  IT_PROG     200        1677             10  01-OCT-2016
                   1679  LN_3       EM_3   01-OCT-2016  IT_PROG     300        1677                 01-OCT-2016
                   1680  LN_4       EM_4   01-OCT-2016  AD_ASST     400        1677             10  01-OCT-2016
                   1681  LN_5       EM_5   01-OCT-2016  IT_PROG     500        1677             20  01-OCT-2016

        }
        =

        GROUP Department Parameter {
        ============================

            Department Id
            -------------
                       10

        }
        =

    OUTPUTS
    =======

        GROUP Select results: Actual = 0, Expected = 0: SUCCESS
        =======================================================

} 0 failed, of 1: SUCCESS
=========================

TIMING: Actual = 7, Expected <= 1: FAILURE
==========================================

SUMMARY for TT_Emp_WS.tt_AIP_Get_Dept_Emps
==========================================

Scenario                                                                                              # Failed  # Tests  Status
----------------------------------------------------------------------------------------------------  --------  -------  -------
DS-1, testing inner, outer joins, analytic over dep, and global ratios with 1 dep (10) - pass dep 10         0        2  SUCCESS
DS-2, testing same as 1 but with extra emp in another dep (20) - pass dep 10                                 0        2  SUCCESS
DS-2, as second scenario, but - pass dep 20                                                                  0        1  SUCCESS
DS-2, as second scenario, but - pass null dep                                                                0        1  SUCCESS
DS-3, Salaries total 1500 (< threshold of 1600, so return nothing) - pass dep 10                             0        1  SUCCESS
Timing                                                                                                       1        1  FAILURE
----------------------------------------------------------------------------------------------------  --------  -------  -------
Total                                                                                                        1        8  FAILURE
----------------------------------------------------------------------------------------------------  --------  -------  -------

Timer Set: TT_Emp_WS.tt_AIP_Get_Dept_Emps, Constructed at 01 Oct 2016 09:14:12, written at 09:14:13
===================================================================================================
[Timer timed: Elapsed (per call): 0.03 (0.000034), CPU (per call): 0.03 (0.000030), calls: 1000, '***' denotes corrected line below]

Timer       Elapsed         CPU         Calls       Ela/Call       CPU/Call
-------  ----------  ----------  ------------  -------------  -------------
Setup          0.08        0.05             5        0.01660        0.01000
Caller         0.03        0.01             5        0.00680        0.00200
(Other)        0.10        0.10             1        0.09500        0.10000
-------  ----------  ----------  ------------  -------------  -------------
Total          0.21        0.16            11        0.01927        0.01455
-------  ----------  ----------  ------------  -------------  -------------






 

Design Patterns for Database API Testing 2: Views 1 - Design

Last October I gave a presentation on database unit testing with utPLSQL, Oracle Unit Testing with utPLSQL. I mentioned design patterns as a way of reducing the effort of building unit tests and outlined some strategies for coding them effectively.

In the current set of articles, I develop the ideas further, starting from the idea that all database APIs can be considered in terms of the axes:

  • direction (i.e. getter or setter, noting that setters can also 'get')
  • mode (i.e. real time or batch)

For each cell in the implied matrix, I construct an example API (or view) with specified requirements against Oracle's HR demo schema, and use this example to construct a testing program with appropriate scenarios as a design pattern. Concepts and common patterns and anti-patterns in automated API testing are discussed throughout, and these are largely independent of the testing framework used. However, the examples use my own lightweight independent framework, which is designed to help avoid many API testing anti-patterns. The code is available on GitHub at BrenPatF/trapit_oracle_tester, and includes both the framework and the design pattern examples.

Behind the four examples, there is an underlying design pattern that involves wrapping the API call in a 'pure' procedure, called once per scenario, with the output 'actuals' array including everything affected by the API, whether as output parameters, or on database tables, etc. The inputs are also extended from the API parameters to include any other effective inputs. Assertion takes place after all scenarios and is against the extended outputs, with extended inputs also listed. This concept of the 'pure' function, central to Functional Programming, has important advantages in automated testing. I explained the concepts involved in a presentation at the Oracle User Group Ireland Conference in March 2018:

The Database API Viewed As A Mathematical Function: Insights into Testing


In this 2-part article, I present a design pattern for testing views. I start by discussing when and how to test views. Unlike in the design pattern of the first article in the series, Design Patterns for Database API Testing 1: Web Service Saving - Design, test data has to be created during testing of views, and a very general approach to creating and selecting the test data is proposed. The use case for the design pattern is described, and scenarios and sub-scenarios are defined conceptually. Finally, the output from the testing is presented, with notes.

The second post provides some code extracts, with notes: Design Patterns for Database API Testing 2: Views 2 - Code.

When to Test Views

Views can be simple or complex, or, as I categorise them in Brendan's 2-Page Oracle Programming Standards, thin or thick, where thick views include table joins and thin views don't. Thin views do not normally require testing, while it may or may not be appropriate to test thick views.

As explained in the second part of the first article mentioned above, method-based testing is a bad idea, and occurs when the test suite is based on testing all the methods in a package, rather than units of (external) behaviour (often corresponding to procedures prefixed AIP in a common naming convention). Similarly, we can consider views in the same way as methods and ask whether they represent testable units of behaviour or are merely internal code structures, which should not normally have individual automated tests for the reasons given there.

Good examples of views that should be tested would be those that form the basis of complex data extraction to file, by ETL tools such as Informatica, or those that form the basis of reporting tools such as Business Objects. In fact, it is very good practice to place SQL for these tools into views precisely so that they can be tested.

How to Test Views Using a PL/SQL Testing Framework

In order to leverage a PL/SQL API testing framework to also test views, the API test package procedures call a library procedure, passing the name of the relevant view; the library procedure returns the result of querying the view as an array of delimited strings (sketched below), and the API test procedures then compare the results against their own expected results.
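
A minimal sketch of such a library function is shown below, using dynamic SQL to concatenate a caller-supplied column list into one delimited string per row. The function name, parameters and delimiter handling are illustrative assumptions, not the actual library code.

CREATE OR REPLACE FUNCTION Get_View (p_view_name IN VARCHAR2,              -- name of the view to query
                                     p_sel_lis   IN SYS.ODCIVarchar2List,  -- column expressions to select
                                     p_where     IN VARCHAR2 DEFAULT NULL, -- optional where clause
                                     p_delim     IN VARCHAR2 DEFAULT '|')  -- field delimiter
  RETURN SYS.ODCIVarchar2List IS
  l_sql     VARCHAR2(4000);
  l_ret_lis SYS.ODCIVarchar2List;
BEGIN

  -- Build a query returning one delimited string per row of the view
  FOR i IN 1..p_sel_lis.COUNT LOOP
    l_sql := l_sql || CASE WHEN i > 1 THEN ' || ''' || p_delim || ''' || ' END || p_sel_lis(i);
  END LOOP;
  l_sql := 'SELECT ' || l_sql || ' FROM ' || p_view_name ||
           CASE WHEN p_where IS NOT NULL THEN ' WHERE ' || p_where END;

  EXECUTE IMMEDIATE l_sql BULK COLLECT INTO l_ret_lis;
  RETURN l_ret_lis;

END Get_View;
/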

Each API test procedure will have its own setup local procedure to create test data, and we need to discuss the issue of distinguishing test data from pre-existing data.

Test Data

In the earlier article on database save procedures, we did not create any test data within the testing code itself, but the base procedure did create data, and those records were queried back for assertion. In order to select only the data created by the procedure call, a prefix was used in one of the string fields, which was assumed not to exist already. This is a workable approach in some cases, but may not be possible in general. Let us consider the different types of database data that may affect our testing:

  • Data created by the base code being tested
  • Data created by test code to be read by the base code
  • Data not created by test code to be read by base code

In order to verify that the program calls are giving results as expected, the test code needs to know all the data that influence the results, not just the directly created data. Our view testing use case described below has an example where the results depend on an aggregate of all the records on the database. This is a problem when we have a shared database, where we cannot freeze the data at the time of test development. In order to handle this problem, we propose to borrow a technique used in Oracle's E-Business Suite applications.

Partitioning Views with System Contexts
In Oracle E-Business Suite's multi-org implementations, transactions are partitioned by a numeric identifier for the organization owning the transaction. This org_id value is stored in a column in the base table on transaction creation. Within the application code the base table is not queried directly, but through a view that restricts records returned to those for the organization corresponding to the current role of the application user, which is stored in the userenv system context (this is true up to release 11.5, but the mechanism changed in release 12.1).

See SYS_CONTEXT for information on the system context database feature, and the Oracle E-Business Suite Multiple Organizations Implementation Guide (12.1) for the release 12.1 multi-org implementation.

Partitioning Views for Testing
We propose to use views in a similar way to the multi-org views, to restrict records to those created in the testing session, by means of a ttid column on the base table that will hold the session id. The new optional column is added to those tables where this approach is required, and views are created on the tables. Our testing utility package Utils_TT sets a context variable to the value 'TT' to signify testing mode, and the session id is stored in a package variable in the general utilities package Utils.

Any base code that inserts data into the tables has to check for test mode and, if it is set, put the session id into the ttid column, leaving it blank otherwise. The views use the following clause:

 WHERE (ttid = SYS_Context ('userenv', 'sessionid') OR 
        Substr (Nvl (SYS_Context ('userenv', 'client_info'), 'XX'), 1, 2) != 'TT')

Both test code and base code now query the views instead of the base tables. As the base code to write to the tables has to account for the new column, it is necessary for the column to be added in all instances including production. If this seems a little drastic, consider the importance that you attach to testing, and bear in mind that the earlier, less general, approaches may suffice in many cases. In these design pattern demos I use the general solution.
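
Putting this together, the testing view and the setting of test mode might look like the sketch below. This is illustrative only: the view is assumed to be created in the code schema with the same name as the base table, the ttid column is assumed to have been added to hr.employees, and the test utility is assumed to signal test mode by setting the client_info attribute (which is what the view's predicate reads via SYS_Context).

-- Testing view over the base table, restricting rows to the current session in test mode
CREATE OR REPLACE VIEW employees AS
SELECT e.*
  FROM hr.employees e
 WHERE (e.ttid = SYS_Context ('userenv', 'sessionid') OR
        Substr (Nvl (SYS_Context ('userenv', 'client_info'), 'XX'), 1, 2) != 'TT');

-- The testing utility signals test mode for the session by setting client_info to 'TT'
BEGIN
  DBMS_Application_Info.Set_Client_Info ('TT');
END;
/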

Schema Structure

In the earlier articles, the base code and test packages were created in the HR schema, with utility packages kept in the custom brendan schema. However, it is more common to use separate schemas for code and data, so we will now place all packages and supporting objects in the brendan schema, and create the testing views there.

Design Pattern Use Case for Testing Views

Modern Oracle SQL is very powerful and can apply complex logic within a single statement, reducing the need for more complex procedural code. In order to show how to test SQL, we will devise a test view, HR_Test_V, having a range of features that we might want to test in general:

  • Inner joins suppress driving records where there is no joining record
  • Outer joins return driving records where there is no joining record
  • Analytic functions that partition by some key, and return aggregates on the returned record set
  • Functions based on aggregates over records that include those not in the returned record set
  • Constraints based on aggregates over records that include those not in the returned record set
  • Constraints on column values

The view functionality can be described in words as:

  • Selected values
    • Employee name, department name, and salary
    • Manager's name
    • Ratio of employee's salary to the department average (returned employees only)
    • Ratio of employee's salary to the average salary of all employees
  • Constraints
    • Exclude employees in job 'AD_ASST'
    • Exclude employees without a department
    • Do not return any records if the total salary of all employees is below 1600
  • Outer join
    • Include employees both with and without a manager

The view SQL is:

CREATE OR REPLACE VIEW hr_test_view_v AS
WITH all_emps AS (
        SELECT Avg (salary) avg_sal, SUM (salary) sal_tot_g
          FROM employees e
)
SELECT e.last_name, d.department_name, m.last_name manager, e.salary,
       Round (e.salary / Avg (e.salary) OVER (PARTITION BY e.department_id), 2) sal_rat,
       Round (e.salary / a.avg_sal, 2) sal_rat_g
  FROM all_emps a
 CROSS JOIN employees e
  JOIN departments d
    ON d.department_id = e.department_id
  LEFT JOIN employees m
    ON m.employee_id = e.manager_id
 WHERE e.job_id != 'AD_ASST'
   AND a.sal_tot_g >= 1600

Scenarios and Sub-scenarios

Scenario definition
Following our earlier article, we may define a scenario as being the set of all relevant records, both on the database and passed as parameters, to a single program call. API or view testing involves creating one or more scenarios, calling the program (or executing the process) for each scenario, and verifying that the output records are as expected.

Good testing is achieved when the scenarios are chosen to validate as wide a range of behaviours as possible. It is not always, or usually, necessary to create a new scenario for each aspect of behaviour to be tested.

Sub-scenario definition
Often, several features can be tested in the same program call by setting up different records in the scenario that will independently test the different features. For example, in our use case above we can create employees with and without a department, and with and without a manager in the same scenario to test the different types of join.

It may be helpful to think of these separate records, or fields within a record, as corresponding to sub-scenarios, and try to construct scenarios as efficiently as possible without making more calls than necessary.

View Test Output

Data setup section

SCENARIO 1: DS-1, testing inner, outer joins, analytic over dep, and global ratios with 1 dep, Employees created in setup: DS-1 - 4 emps, 1 dep (10), emp-3 has no dep, emp-4 has bad job
=========================================================================================================================================================================================

#  Employee id  Department id     Manager  Job id          Salary
-  -----------  -------------  ----------  ----------  ----------
1         1493             10              IT_PROG           1000
2         1494             10        1493  IT_PROG           2000
3         1495                       1493  IT_PROG           3000
4         1496             10        1493  AD_ASST           4000

SCENARIO 2: DS-2, testing same as 1 but with extra emp in another dep, Employees created in setup: DS-2 - As dataset 1 but with extra emp-5, in second dep (20)
===============================================================================================================================================================

#  Employee id  Department id     Manager  Job id          Salary
-  -----------  -------------  ----------  ----------  ----------
1         1497             10              IT_PROG           1000
2         1498             10        1497  IT_PROG           2000
3         1499                       1497  IT_PROG           3000
4         1500             10        1497  AD_ASST           4000
5         1501             20        1497  IT_PROG           5000

SCENARIO 3: DS-2, passing 'WHERE dep=10', Employees created in setup: DS-2 - As dataset 1 but with extra emp-5, in second dep (20)
==================================================================================================================================

#  Employee id  Department id     Manager  Job id          Salary
-  -----------  -------------  ----------  ----------  ----------
1         1502             10              IT_PROG           1000
2         1503             10        1502  IT_PROG           2000
3         1504                       1502  IT_PROG           3000
4         1505             10        1502  AD_ASST           4000
5         1506             20        1502  IT_PROG           5000

SCENARIO 4: DS-3, Salaries total 1500 (< threshold of 1600), Employees created in setup: DS-3 - As dataset 2 but with salaries * 0.1, total below reporting threshold of 1600
=============================================================================================================================================================================

#  Employee id  Department id     Manager  Job id          Salary
-  -----------  -------------  ----------  ----------  ----------
1         1507             10              IT_PROG            100
2         1508             10        1507  IT_PROG            200
3         1509                       1507  IT_PROG            300
4         1510             10        1507  AD_ASST            400
5         1511             20        1507  IT_PROG            500

Notes on data setup section

  • There are three data sets, and four scenarios, each of which references a data set
  • The call to set up the data for a scenario writes out all the data created
  • A header provides a description of the features (or sub-scenarios) in the data set
  • In the output above scenarios 2 and 3 use the same data set, DS-2

Results section

SQL>BEGIN

  Utils.Clear_Log;
  Utils_TT.Run_Suite (Utils_TT.c_tt_suite_bren);

EXCEPTION
  WHEN OTHERS THEN
    Utils.Write_Other_Error;
END;
/
SQL> @L_Log_Default

TEXT
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------


TRAPIT TEST: TT_View_Drivers.tt_HR_Test_View_V
==============================================

Employees created in setup: DS-1 - 4 emps, 1 dep (10), emp-3 has no dep, emp-4 has bad job
==========================================================================================

#  Employee id  Department id     Manager  Job id          Salary
-  -----------  -------------  ----------  ----------  ----------
1         1518             10              IT_PROG           1000
2         1519             10        1518  IT_PROG           2000
3         1520                       1518  IT_PROG           3000
4         1521             10        1518  AD_ASST           4000

Employees created in setup: DS-2 - As dataset 1 but with extra emp-5, in second dep (20)
========================================================================================

#  Employee id  Department id     Manager  Job id          Salary
-  -----------  -------------  ----------  ----------  ----------
1         1522             10              IT_PROG           1000
2         1523             10        1522  IT_PROG           2000
3         1524                       1522  IT_PROG           3000
4         1525             10        1522  AD_ASST           4000
5         1526             20        1522  IT_PROG           5000

Employees created in setup: DS-2 - As dataset 1 but with extra emp-5, in second dep (20)
========================================================================================

#  Employee id  Department id     Manager  Job id          Salary
-  -----------  -------------  ----------  ----------  ----------
1         1527             10              IT_PROG           1000
2         1528             10        1527  IT_PROG           2000
3         1529                       1527  IT_PROG           3000
4         1530             10        1527  AD_ASST           4000
5         1531             20        1527  IT_PROG           5000

Employees created in setup: DS-3 - As dataset 2 but with salaries * 0.1, total below reporting threshold of 1600
================================================================================================================

#  Employee id  Department id     Manager  Job id          Salary
-  -----------  -------------  ----------  ----------  ----------
1         1532             10              IT_PROG            100
2         1533             10        1532  IT_PROG            200
3         1534                       1532  IT_PROG            300
4         1535             10        1532  AD_ASST            400
5         1536             20        1532  IT_PROG            500

SCENARIO 1: DS-1, testing inner, outer joins, analytic over dep, and global ratios with 1 dep {
===============================================================================================

    INPUTS
    ======

        GROUP Employee {
        ================

            Employee Id  Last Name  Email  Hire Date  Job      Salary  Manager Id  department Id
            -----------  ---------  -----  ---------  -------  ------  ----------  -------------
                   1518  LN_1       EM_1   09-JUL-16  IT_PROG    1000                         10
                   1519  LN_2       EM_2   09-JUL-16  IT_PROG    2000        1518             10
                   1520  LN_3       EM_3   09-JUL-16  IT_PROG    3000        1518
                   1521  LN_4       EM_4   09-JUL-16  AD_ASST    4000        1518             10

        }
        =

        GROUP Where {
        =============

            Where
            -----


        }
        =

    OUTPUTS
    =======

        GROUP Select results: Actual = 2, Expected = 2 {
        ================================================

            F?  Name  Department      Manager  Salary  Salary Ratio (dep)  Salary Ratio (overall)
            --  ----  --------------  -------  ------  ------------------  ----------------------
                LN_1  Administration             1000                 .67                      .4
                LN_2  Administration  LN_1       2000                1.33                      .8

        } 0 failed, of 2: SUCCESS
        =========================

} 0 failed, of 2: SUCCESS
=========================

SCENARIO 2: DS-2, testing same as 1 but with extra emp in another dep {
=======================================================================

    INPUTS
    ======

        GROUP Employee {
        ================

            Employee Id  Last Name  Email  Hire Date  Job      Salary  Manager Id  department Id
            -----------  ---------  -----  ---------  -------  ------  ----------  -------------
                   1522  LN_1       EM_1   09-JUL-16  IT_PROG    1000                         10
                   1523  LN_2       EM_2   09-JUL-16  IT_PROG    2000        1522             10
                   1524  LN_3       EM_3   09-JUL-16  IT_PROG    3000        1522
                   1525  LN_4       EM_4   09-JUL-16  AD_ASST    4000        1522             10
                   1526  LN_5       EM_5   09-JUL-16  IT_PROG    5000        1522             20

        }
        =

        GROUP Where {
        =============

            Where
            -----


        }
        =

    OUTPUTS
    =======

        GROUP Select results: Actual = 3, Expected = 3 {
        ================================================

            F?  Name  Department      Manager  Salary  Salary Ratio (dep)  Salary Ratio (overall)
            --  ----  --------------  -------  ------  ------------------  ----------------------
                LN_1  Administration             1000                 .67                     .33
                LN_2  Administration  LN_1       2000                1.33                     .67
                LN_5  Marketing       LN_1       5000                   1                    1.67

        } 0 failed, of 3: SUCCESS
        =========================

} 0 failed, of 3: SUCCESS
=========================

SCENARIO 3: DS-2, passing 'WHERE dep=10' {
==========================================

    INPUTS
    ======

        GROUP Employee {
        ================

            Employee Id  Last Name  Email  Hire Date  Job      Salary  Manager Id  department Id
            -----------  ---------  -----  ---------  -------  ------  ----------  -------------
                   1527  LN_1       EM_1   09-JUL-16  IT_PROG    1000                         10
                   1528  LN_2       EM_2   09-JUL-16  IT_PROG    2000        1527             10
                   1529  LN_3       EM_3   09-JUL-16  IT_PROG    3000        1527
                   1530  LN_4       EM_4   09-JUL-16  AD_ASST    4000        1527             10
                   1531  LN_5       EM_5   09-JUL-16  IT_PROG    5000        1527             20

        }
        =

        GROUP Where {
        =============

            Where
            --------------------------------
            department_name='Administration'

        }
        =

    OUTPUTS
    =======

        GROUP Select results: Actual = 2, Expected = 2 {
        ================================================

            F?  Name  Department      Manager  Salary  Salary Ratio (dep)  Salary Ratio (overall)
            --  ----  --------------  -------  ------  ------------------  ----------------------
                LN_1  Administration             1000                 .67                     .33
                LN_2  Administration  LN_1       2000                1.33                     .67

        } 0 failed, of 2: SUCCESS
        =========================

} 0 failed, of 2: SUCCESS
=========================

SCENARIO 4: DS-3, Salaries total 1500 (< threshold of 1600) {
=============================================================

    INPUTS
    ======

        GROUP Employee {
        ================

            Employee Id  Last Name  Email  Hire Date  Job      Salary  Manager Id  department Id
            -----------  ---------  -----  ---------  -------  ------  ----------  -------------
                   1532  LN_1       EM_1   09-JUL-16  IT_PROG     100                         10
                   1533  LN_2       EM_2   09-JUL-16  IT_PROG     200        1532             10
                   1534  LN_3       EM_3   09-JUL-16  IT_PROG     300        1532
                   1535  LN_4       EM_4   09-JUL-16  AD_ASST     400        1532             10
                   1536  LN_5       EM_5   09-JUL-16  IT_PROG     500        1532             20

        }
        =

        GROUP Where {
        =============

            Where
            -----


        }
        =

    OUTPUTS
    =======

        GROUP Select results: Actual = 0, Expected = 0: SUCCESS
        =======================================================

} 0 failed, of 1: SUCCESS
=========================

TIMING: Actual = 48, Expected <= 1: FAILURE
===========================================

SUMMARY for TT_View_Drivers.tt_HR_Test_View_V
=============================================

Scenario                                                                           # Failed  # Tests  Status
---------------------------------------------------------------------------------  --------  -------  -------
DS-1, testing inner, outer joins, analytic over dep, and global ratios with 1 dep         0        2  SUCCESS
DS-2, testing same as 1 but with extra emp in another dep                                 0        3  SUCCESS
DS-2, passing 'WHERE dep=10'                                                              0        2  SUCCESS
DS-3, Salaries total 1500 (< threshold of 1600)                                           0        1  SUCCESS
Timing                                                                                    1        1  FAILURE
---------------------------------------------------------------------------------  --------  -------  -------
Total                                                                                     1        9  FAILURE
---------------------------------------------------------------------------------  --------  -------  -------

Timer Set: TT_View_Drivers.tt_HR_Test_View_V, Constructed at 09 Jul 2016 13:32:42, written at 13:32:42
======================================================================================================
[Timer timed: Elapsed (per call): 0.01 (0.000013), CPU (per call): 0.02 (0.000020), calls: 1000, '***' denotes corrected line below]

Timer       Elapsed         CPU         Calls       Ela/Call       CPU/Call
-------  ----------  ----------  ------------  -------------  -------------
Setup          0.11        0.00             4        0.02675        0.00000
Caller         0.19        0.06             4        0.04750        0.01500
(Other)        0.03        0.03             1        0.02700        0.03000
-------  ----------  ----------  ------------  -------------  -------------
Total          0.32        0.09             9        0.03600        0.01000
-------  ----------  ----------  ------------  -------------  -------------

Notes on results section

  • In a view test there is only one group, namely the selected data set

The second part of the article is here: Design Patterns for Database Unit Testing 2: Views 2 - Code






 

A Template Script for JDBC Calling of Oracle Procedures with Object Array Parameters

Some time ago I wrote an Oracle database package for a web service. The Java developer for the service told me that it was throwing an error when called from Java, although I had unit tested it from PL/SQL. He gave me a small Java driver script to demonstrate the problem, which allowed the cause to be identified quickly: since both Java and PL/SQL have boolean data types, I had assumed that a BOOLEAN parameter was a sensible way to pass a boolean value. However, it turns out that this does not work through JDBC, and so I replaced it with an integer parameter.
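
To illustrate the kind of change involved, here is a minimal sketch (the procedure and parameter names are hypothetical, not from the original package): the published signature takes an integer flag that JDBC can bind, and the boolean is derived inside the procedure.

-- Hypothetical sketch only: a BOOLEAN parameter is replaced by an integer flag that
-- JDBC can bind, with the boolean value derived inside the procedure
CREATE OR REPLACE PROCEDURE save_order (
    p_order_id    INTEGER,
    p_commit_yn   INTEGER   -- 1 = true, 0 = false; replaces a former BOOLEAN parameter
) IS
  l_commit   BOOLEAN := (p_commit_yn = 1);   -- derive the boolean internally
BEGIN
  NULL;   -- body unchanged, using l_commit where the BOOLEAN parameter was used
END save_order;
/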

It occurred to me then that it would be useful if the database developer were able, in general, to test the JDBC compatibility of his or her procedure as a final step after unit testing. To this end I created a more generic example script based on a simple procedure that I wrote against Oracle's HR demo schema, the same procedure that I used as an example of a unit testing design pattern in Design Patterns for Database Unit Testing 2: Web Service Saving - Code.

Update, 4 November 2017: I have made a self-contained project on GitHub with both Java and Oracle code, avoiding dependency on my testing project. JDBC Calling of Oracle Procedures with Object Array Parameters on GitHub. I have also added the Oracle code below.

The code below runs against any Oracle instance in which the standard Oracle HR demo schema has been installed. There is a video demonstration of how to install and run it at the end of this article. The procedure has one input and one output object array parameter, and can easily be extended as desired. It requires one jar file in the classpath, ojdbc6.jar, which is available in Oracle client or database installations, and can be run from an IDE such as Eclipse.

Java Code

package jdbcdemo;
/***************************************************************************************************
Name:        Driver.java
Description: This is a Java driver script for Brendan's HR demo web service procedure. It is
             designed to serve as a template for other web service procedures to allow a database
             developer to do a JDBC integration test easily, and can also be used as a starting point
             for Java development.

             The template procedure takes an input array of objects and has an output array of 
             objects. It is easy to update for any named object and array types, procedure and
             Oracle connection. Any other signature types would need additional changes.

	     See 'A Template Script for JDBC Calling of Oracle Procedures with Object Array Parameters'
             
A Template Script for JDBC Calling of Oracle Procedures with Object Array Parameters
Modification History
Who                  When        Which What
-------------------- ----------- ----- -------------------------------------------------------------
Brendan Furey        14-May-2016 1.0   Created
Brendan Furey        04-Nov-2017 1.1   Put into new GitHub project along with Oracle code

***************************************************************************************************/
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Array;
import java.sql.Struct;

import oracle.jdbc.OracleTypes;
import oracle.jdbc.OracleCallableStatement;
import oracle.jdbc.OracleConnection;

public class Driver {

  // Change section 1/2: Replace these constants with your own values
  private static final String DB_CONNECTION = "jdbc:oracle:thin:hr/hr@localhost:1521/orclpdb";
  private static final String TY_IN_OBJ     = "TY_EMP_IN_OBJ";
  private static final String TY_IN_ARR     = "TY_EMP_IN_ARR";
  private static final String TY_OUT_ARR    = "TY_EMP_OUT_ARR";
  private static final String PROC_NAME     = "Emp_WS.AIP_Save_Emps";

  private static OracleConnection conn;

  public static void main(String[] argv) {
    try {
      getDBConnection ();
      prOutArray (callProc (inArray ()));
    } catch (SQLException e) {
      System.out.println(e.getMessage());
    }
  }

  private static Array inArray () throws SQLException {

    // Change section 2/2: Replace [2] with number of test records, and the arrays with their values
    Struct[] struct = new Struct[2];
    struct[0] = conn.createStruct (TY_IN_OBJ, new Object[] {"LN 1", "EM 1", "IT_PROG", 1000});
    struct[1] = conn.createStruct (TY_IN_OBJ, new Object[] {"LN 2", "EM 2", "IT_PROG", 2000});
    return conn.createARRAY (TY_IN_ARR, struct);
  }

  private static Array callProc (Array objArray) throws SQLException {

    OracleCallableStatement ocs = (OracleCallableStatement) conn.prepareCall ("BEGIN "+PROC_NAME+"(:1, :2); END;");
    ocs.setArray (1, objArray);
    ocs.registerOutParameter (2, OracleTypes.ARRAY, TY_OUT_ARR);
    ocs.execute ();
    return ocs.getARRAY (2);
  }

  private static void prOutArray (Array arr) throws SQLException {

    Object[] objArr = (Object[]) arr.getArray();
    int j = 0;
    for (Object rec : objArr) {
      Object[] objLis = ((Struct)rec).getAttributes ();
      int i = 0;
      String recStr = "";
      for (Object fld : objLis) {
        if (i++ > 0) recStr = recStr + '/';
        recStr = recStr + fld.toString();
      }
      System.out.println ("Record "+(++j)+": "+recStr);
    }
  }

  private static void getDBConnection () throws SQLException {

    conn = (OracleConnection) DriverManager.getConnection (DB_CONNECTION);
    conn.setAutoCommit (false);
    System.out.println ("Connected...");
  }
}

Example output

Connected...
Record 1: 239/TWO HUNDRED THIRTY-NINE
Record 2: 240/TWO HUNDRED FORTY

Google Java Style

Oracle Code

/***************************************************************************************************

Author:      Brendan Furey
Description: Script to create objects to demo JDBC procedure calls with object array parameters

         See 'A Template Script for JDBC Calling of Oracle Procedures with Object Array Parameters'
             
A Template Script for JDBC Calling of Oracle Procedures with Object Array Parameters
Modification History
Who                  When        Which What
-------------------- ----------- ----- -------------------------------------------------------------
Brendan Furey        04-May-2016 1.0   Created
Brendan Furey        04-Nov-2017 1.1   Extracted the JDBC demo code from the unit testing project,
                                       and put into new GitHub project along with Java code

***************************************************************************************************/
REM Run this script from Oracle's standard HR schema to create objects to demo JDBC procedure calls

COLUMN "Database" FORMAT A20
COLUMN "Time"     FORMAT A20
COLUMN "Version"  FORMAT A30
COLUMN "Session"  FORMAT 9999990
COLUMN "OS User"  FORMAT A10
COLUMN "Machine"  FORMAT A20
SELECT 'Start: ' || dbs.name "Database", To_Char (SYSDATE,'DD-MON-YYYY HH24:MI:SS') "Time",
       Replace (Substr(ver.banner, 1, Instr(ver.banner, '64')-4), 'Enterprise Edition Release ', '') "Version"
  FROM v$database dbs, v$version ver
 WHERE ver.banner LIKE 'Oracle%';

PROMPT Input types creation
DROP TYPE ty_emp_in_arr
/
CREATE OR REPLACE TYPE ty_emp_in_obj AS OBJECT (
        last_name       VARCHAR2(25),
        email           VARCHAR2(25),
        job_id          VARCHAR2(10),
        salary          NUMBER
)
/
CREATE TYPE ty_emp_in_arr AS TABLE OF ty_emp_in_obj
/
PROMPT Output types creation
DROP TYPE ty_emp_out_arr
/
CREATE OR REPLACE TYPE ty_emp_out_obj AS OBJECT (
        employee_id     NUMBER,
        description     VARCHAR2(500)
)
/
CREATE TYPE ty_emp_out_arr AS TABLE OF ty_emp_out_obj
/
CREATE OR REPLACE PACKAGE Emp_WS AS
/***************************************************************************************************
Description: HR demo web service code. Procedure saves new employees list and returns primary key
             plus same in words, or zero plus error message in output list
***************************************************************************************************/

PROCEDURE AIP_Save_Emps (p_ty_emp_in_lis           ty_emp_in_arr,
                         x_ty_emp_out_lis      OUT ty_emp_out_arr);

END Emp_WS;
/
CREATE OR REPLACE PACKAGE BODY Emp_WS AS

PROCEDURE AIP_Save_Emps (p_ty_emp_in_lis           ty_emp_in_arr,     -- list of employees to insert
                         x_ty_emp_out_lis      OUT ty_emp_out_arr) IS -- list of employee results

  l_ty_emp_out_lis      ty_emp_out_arr;
  bulk_errors           EXCEPTION;
  PRAGMA                EXCEPTION_INIT (bulk_errors, -24381);
  n_err                 PLS_INTEGER := 0;

BEGIN

  FORALL i IN 1..p_ty_emp_in_lis.COUNT
    SAVE EXCEPTIONS
    INSERT INTO employees (
        employee_id,
        last_name,
        email,
        hire_date,
        job_id,
        salary
    ) VALUES (
        employees_seq.NEXTVAL,
        p_ty_emp_in_lis(i).last_name,
        p_ty_emp_in_lis(i).email,
        SYSDATE,
        p_ty_emp_in_lis(i).job_id,
        p_ty_emp_in_lis(i).salary
    )
    RETURNING ty_emp_out_obj (employee_id, To_Char(To_Date(employee_id,'J'),'JSP')) BULK COLLECT INTO x_ty_emp_out_lis;

EXCEPTION
  WHEN bulk_errors THEN

    l_ty_emp_out_lis := x_ty_emp_out_lis;

    FOR i IN 1 .. sql%BULK_EXCEPTIONS.COUNT LOOP
      IF i > x_ty_emp_out_lis.COUNT THEN
        x_ty_emp_out_lis.Extend;
      END IF;
      x_ty_emp_out_lis (SQL%Bulk_Exceptions (i).Error_Index) := ty_emp_out_obj (0, SQLERRM (- (SQL%Bulk_Exceptions (i).Error_Code)));
    END LOOP;

    FOR i IN 1..p_ty_emp_in_lis.COUNT LOOP
      IF i > x_ty_emp_out_lis.COUNT THEN
        x_ty_emp_out_lis.Extend;
      END IF;
      IF x_ty_emp_out_lis(i).employee_id = 0 THEN
        n_err := n_err + 1;
      ELSE
        x_ty_emp_out_lis(i) := l_ty_emp_out_lis(i - n_err);
      END IF;
    END LOOP;

END AIP_Save_Emps;
END Emp_WS;
/
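
Before moving to Java, the installed types and procedure can be checked directly from PL/SQL with a small anonymous block along the following lines (the test values are arbitrary, and SERVEROUTPUT needs to be on to see the results):

DECLARE
  l_emp_in_lis    ty_emp_in_arr := ty_emp_in_arr (
                        ty_emp_in_obj ('LN 1', 'EM 1', 'IT_PROG', 1000),
                        ty_emp_in_obj ('LN 2', 'EM 2', 'IT_PROG', 2000));
  l_emp_out_lis   ty_emp_out_arr;
BEGIN
  Emp_WS.AIP_Save_Emps (p_ty_emp_in_lis => l_emp_in_lis, x_ty_emp_out_lis => l_emp_out_lis);
  FOR i IN 1..l_emp_out_lis.COUNT LOOP
    DBMS_OUTPUT.Put_Line ('Record ' || i || ': ' || l_emp_out_lis(i).employee_id || '/' ||
                          l_emp_out_lis(i).description);
  END LOOP;
  ROLLBACK;   -- discard the test inserts
END;
/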

Here is a demo of installing and running the code, Oracle and Java:






 

Design Patterns for Database API Testing 1: Web Service Saving 1 - Design

Last October I gave a presentation on database unit testing with utPLSQL, Oracle Unit Testing with utPLSQL. I mentioned design patterns as a way of reducing the effort of building unit tests and outlined some strategies for coding them effectively.

In the current set of articles, I develop the ideas further, starting from the idea that all database APIs can be considered in terms of the axes:

  • direction (i.e. getter or setter, noting that setters can also 'get')
  • mode (i.e. real time or batch)

For each cell in the implied matrix, I construct an example API (or view) with specified requirements against Oracle's HR demo schema, and use this example to construct a testing program with appropriate scenarios as a design pattern. Concepts and common patterns and anti-patterns in automated API testing are discussed throughout, and these are largely independent of testing framework used. However, the examples use my own lightweight independent framework that is designed to help avoid many API testing anti-patterns. The code is available on GitHub here, BrenPatF/db_unit_test, and includes both framework and design pattern examples.

Behind the four examples, there is an underlying design pattern that involves wrapping the API call in a 'pure' procedure, called once per scenario, with the output 'actuals' array including everything affected by the API, whether as output parameters, or on database tables, etc. The inputs are also extended from the API parameters to include any other effective inputs. Assertion takes place after all scenarios and is against the extended outputs, with extended inputs also listed. This concept of the 'pure' function, central to Functional Programming, has important advantages in automated testing. I explained the concepts involved in a presentation at the Oracle User Group Ireland Conference in March 2018:

The Database API Viewed As A Mathematical Function: Insights into Testing


In this first example, I present a design pattern for web service 'save' procedures by means of a conceptual discussion, together with a working example of base code and test code for a procedure to save new employees in Oracle's well-known HR demonstration schema.

Design patterns involve abstraction and conceptual separation of general features of a situation from the particular. Therefore we will start with a fairly abstract discussion of automated API testing for the database, before proceeding to discuss the use case in question, describe the test cases, and show the results. The code is presented in another article, Design Patterns for Database API Testing 1: Web Service Saving 2 - Code. That article lists various extremely common antipatterns in database unit testing, and how they may be avoided. The code itself centralises as much as possible in order to make specific test code as small as possible, and is structured very differently from most unit testing code that I have seen.

This is the first in a series of four design patterns, with the second described again in two parts, starting here, Design Patterns for Database API Testing 2: Views 1 - Design. For an overview of the testing framework see TRAPIT - TRansactional API Testing in Oracle.

General Discussion of Database Unit Testing

The underlying functionality for unit testing could be described logically as:

  • Given a list of test inputs, X and a list of expected outputs, E, for function F:
  • For each x in X, with e in E:
    • Apply y = F(x)
    • Assert y = e

As the Functional Programming community knows well, functions having well-defined parameter inputs, returning values as outputs, and with no 'side-effects', are the easiest to test reliably. The difficulty with database unit testing is that most use cases do not fall into that category; instead, database procedures can read from and write to the database as well as using input and output parameters. This means that theoretically the inputs and outputs could include the whole database (at least); furthermore the database is a shared resource, so other parties can alter the data we are dealing with. One important consequence of these facts is that much of the thinking on best practices for unit testing, coming as it does from the non-database world, is not applicable here. So what to do?

Pragmatic testing

To make progress, we note that our purpose with unit testing is not to formally prove that our programs work, but rather includes the following aims:

  • Improve code quality within a Test Driven Development (TDD) approach
  • Provide regression tests to allow safe code-refactoring
  • Detect quickly changes external to the code that cause it to fail, such as reference data changes

That being so, we can note the following guidelines:

  • Testing code is written, as part of TDD, by the developer of the base code, who can identify the relevant database inputs and outputs
  • Some, but not necessarily all, test data may be created in a setup step; static reference data that are required for the code to work usually should not be created in setup
  • Testing code should be instrumented and logged liberally
  • The base code should be timed and a time limit included in the testing; this will help to quickly identify issues such as necessary indexes being dropped
  • Consideration should be given to running the unit test suites in performance and other instances

Design Pattern Use Case for Web Service Save Procedure

  • Purpose of procedure is to save a set of new records to a database table
  • Surrogate primary key is generated
  • Input is an array of objects with the records to be saved
  • Output is an array of objects containing the new primary key plus a description
  • For records failing validation, zero is returned, plus the error message, and the valid records will still be saved

ERD of Input and Output Data Structures in Relation to Scenarios

Unit Testing - ERD

  • In the diagram, a scenario corresponds to a web service call with a set of input records
  • The result of the call can be described as a set of output groups, each having a set of records
  • In our case the output array and the base table form two output groups, with a global group for average call timing
  • The logical diagram in terms of sets of records can be translated into an array structure diagram

Unit Testing - ASD

If we follow a similarly generic approach at the coding level, it becomes very easy to extend a simple example by adding groups, fields and records as necessary.
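
The generic lists used in the test code are simple nested collection types. Their exact definitions are in the GitHub project; a sketch of the kind of declarations involved (the VARCHAR2 length is an assumption) is:

CREATE OR REPLACE TYPE L1_chr_arr IS TABLE OF VARCHAR2(4000);
/
CREATE OR REPLACE TYPE L2_chr_arr IS TABLE OF L1_chr_arr;
/
CREATE OR REPLACE TYPE L3_chr_arr IS TABLE OF L2_chr_arr;
/
CREATE OR REPLACE TYPE L1_num_arr IS TABLE OF NUMBER;
/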

General Unit Test Design Process

The design process involves two high level steps

  • Identify a good set of scenarios with corresponding input records
  • Identify the expected outputs for each output group identified, for each scenario (there will also be a global group, for timing)

Design Pattern Scenarios

Unit Testing - HR

The procedure inserts records in Oracle's HR employees table, and we identify four test scenarios:

  1. Passing a single valid record
  2. Passing a single invalid record
  3. Trying to pass a record with an invalid type
  4. Passing multiple valid records, and one invalid record

Design Pattern Output Groups

  • Records inserted into table employees
  • Records returned in output parameter array
  • Timing of average call

Test Results Output

The output below is for a failing run, where the time limit is breached, and I have also deliberately entered incorrect expected values for two records, to show the difference in formatting between success and failure output group records. I like to include the output tables on completion of development in my technical design document.

TRAPIT TEST: TT_Emp_WS.tt_AIP_Save_Emps
=======================================

SCENARIO 1: 1 valid record {
============================

    INPUTS
    ======

        GROUP Employee {
        ================

            Name  Email  Job      Salary
            ----  -----  -------  ------
            LN 1  EM 1   IT_PROG    1000

        }
        =

    OUTPUTS
    =======

        GROUP Employee: Actual = 1, Expected = 1 {
        ==========================================

            F?  Employee id  Name  Email  Job      Salary
            --  -----------  ----  -----  -------  ------
                       1513  LN 1  EM 1   IT_PROG    1000

        } 0 failed, of 1: SUCCESS
        =========================

        GROUP Output array: Actual = 1, Expected = 1 {
        ==============================================

            F?  Employee id  Description
            --  -----------  ----------------------------------
                       1513  ONE THOUSAND FIVE HUNDRED THIRTEEN

        } 0 failed, of 1: SUCCESS
        =========================

        GROUP Exception: Actual = 0, Expected = 0: SUCCESS
        ==================================================

} 0 failed, of 3: SUCCESS
=========================

SCENARIO 2: 1 invalid job id {
==============================

    INPUTS
    ======

        GROUP Employee {
        ================

            Name  Email  Job      Salary
            ----  -----  -------  ------
            LN 2  EM 2   NON_JOB    1500

        }
        =

    OUTPUTS
    =======

        GROUP Employee: Actual = 0, Expected = 0: SUCCESS
        =================================================

        GROUP Output array: Actual = 1, Expected = 1 {
        ==============================================

            F?  Employee id  Description
            --  -----------  -------------------------------------------------------------------
                          0  ORA-02291: integrity constraint (.) violated - parent key not found

        } 0 failed, of 1: SUCCESS
        =========================

        GROUP Exception: Actual = 0, Expected = 0: SUCCESS
        ==================================================

} 0 failed, of 3: SUCCESS
=========================

SCENARIO 3: 1 invalid number {
==============================

    INPUTS
    ======

        GROUP Employee {
        ================

            Name  Email  Job      Salary
            ----  -----  -------  ------
            LN 3  EM 3   IT_PROG   2000x

        }
        =

    OUTPUTS
    =======

        GROUP Employee: Actual = 0, Expected = 0: SUCCESS
        =================================================

        GROUP Output array: Actual = 0, Expected = 0: SUCCESS
        =====================================================

        GROUP Exception: Actual = 1, Expected = 1 {
        ===========================================

            F?  Error message
            --  -------------------------------------------------------------------------------
                ORA-06502: PL/SQL: numeric or value error: character to number conversion error

        } 0 failed, of 1: SUCCESS
        =========================

} 0 failed, of 3: SUCCESS
=========================

SCENARIO 4: 2 valid records, 1 invalid job id (2 deliberate errors) {
=====================================================================

    INPUTS
    ======

        GROUP Employee {
        ================

            Name  Email  Job      Salary
            ----  -----  -------  ------
            LN 4  EM 4   IT_PROG    3000
            LN 5  EM 5   NON_JOB    4000
            LN 6  EM 6   IT_PROG    5000

        }
        =

    OUTPUTS
    =======

        GROUP Employee: Actual = 2, Expected = 3 {
        ==========================================

            F?  Employee id  Name  Email  Job      Salary
            --  -----------  ----  -----  -------  ------
            F          1515  LN 4  EM 4   IT_PROG    3000
            >          1515  LN 4  EM 4   IT_PROG    1000
                       1517  LN 6  EM 6   IT_PROG    5000
            F
            >          1517  LN 6  EM 6   IT_PROG    5000

        } 2 failed, of 3: FAILURE
        =========================

        GROUP Output array: Actual = 3, Expected = 3 {
        ==============================================

            F?  Employee id  Description
            --  -----------  -------------------------------------------------------------------
                       1515  ONE THOUSAND FIVE HUNDRED FIFTEEN
                          0  ORA-02291: integrity constraint (.) violated - parent key not found
                       1517  ONE THOUSAND FIVE HUNDRED SEVENTEEN

        } 0 failed, of 3: SUCCESS
        =========================

        GROUP Exception: Actual = 0, Expected = 0: SUCCESS
        ==================================================

} 2 failed, of 7: FAILURE
=========================

TIMING: Actual = 222, Expected <= 2: FAILURE
============================================

SUMMARY for TT_Emp_WS.tt_AIP_Save_Emps
======================================

Scenario                                                 # Failed  # Tests  Status
-------------------------------------------------------  --------  -------  -------
1 valid record                                                  0        3  SUCCESS
1 invalid job id                                                0        3  SUCCESS
1 invalid number                                                0        3  SUCCESS
2 valid records, 1 invalid job id (2 deliberate errors)         2        7  FAILURE
Timing                                                          1        1  FAILURE
-------------------------------------------------------  --------  -------  -------
Total                                                           3       17  FAILURE
-------------------------------------------------------  --------  -------  -------

Timer Set: TT_Emp_WS.tt_AIP_Save_Emps, Constructed at 09 Jul 2016 13:32:40, written at 13:32:42
===============================================================================================
[Timer timed: Elapsed (per call): 0.01 (0.000013), CPU (per call): 0.02 (0.000020), calls: 1000, '***' denotes corrected line below]

Timer       Elapsed         CPU         Calls       Ela/Call       CPU/Call
-------  ----------  ----------  ------------  -------------  -------------
Setup          0.04        0.01             1        0.03700        0.01000
Caller         0.67        0.03             3        0.22200        0.01000
SELECT         0.60        0.00             3        0.19933        0.00000
(Other)        0.41        0.03             1        0.40500        0.03000
-------  ----------  ----------  ------------  -------------  -------------
Total          1.71        0.07             8        0.21325        0.00875
-------  ----------  ----------  ------------  -------------  -------------

Notes on output

  • In the event of the suite failing, as here, the utility code ensures that an Oracle error is generated. This can then be trapped by a calling Unix script from a scheduled Jenkins job to send out emails

Conclusions

  • A design pattern has been presented for database web service save procedures, with scenarios and output results
  • The implementation (presented in the part 2 article) was against an Oracle database publicly available demonstration schema, and used Brendan's database API testing framework
  • The main ideas could be applied with any database technology and any testing framework
  • It is suggested that any proposed alternative testing framework can be compared by implementing this design pattern, or similar
  • Three more design patterns are presented in further articles in this series, the second being for testing of views (Design Patterns for Database Unit Testing 2: Views 1 - Design)






 

Design Patterns for Database API Testing 1: Web Service Saving 2 - Code

Last October I gave a presentation on database unit testing with utPLSQL, Oracle Unit Testing with utPLSQL. I mentioned design patterns as a way of reducing the effort of building unit tests and outlined some strategies for coding them effectively.

In the current set of articles, I develop the ideas further, starting from the idea that all database APIs can be considered in terms of the axes:

  • direction (i.e. getter or setter, noting that setters can also 'get')
  • mode (i.e. real time or batch)

For each cell in the implied matrix, I construct an example API (or view) with specified requirements against Oracle's HR demo schema, and use this example to construct a testing program with appropriate scenarios as a design pattern. Concepts and common patterns and anti-patterns in automated API testing are discussed throughout, and these are largely independent of testing framework used. However, the examples use my own lightweight independent framework that is designed to help avoid many API testing anti-patterns. The code is available on GitHub here, BrenPatF/trapit_oracle_tester, and includes both framework and design pattern examples.

Behind the four examples, there is an underlying design pattern that involves wrapping the API call in a 'pure' procedure, called once per scenario, with the output 'actuals' array including everything affected by the API, whether as output parameters, or on database tables, etc. The inputs are also extended from the API parameters to include any other effective inputs. Assertion takes place after all scenarios and is against the extended outputs, with extended inputs also listed. This concept of the 'pure' function, central to Functional Programming, has important advantages in automated testing. I explained the concepts involved in a presentation at the Oracle User Group Ireland Conference in March 2018:

The Database API Viewed As A Mathematical Function: Insights into Testing


In this first example, I present a design pattern for web service 'save' procedures by means of a conceptual discussion, together with a working example of base code and test code for a procedure to save new employees in Oracle's well-known HR demonstration schema.

In part 1, Design Patterns for Database API Testing 1: Web Service Saving 1 - Design, I began with an abstract discussion, and progressed to an example based on Oracle's HR demo schema. That article presented the test scenarios, logical data structures, and the example testing output. Here, I start by listing a number of extremely prevalent antipattern approaches to database API testing, and how to avoid them.

Following this, I list the code for the base procedure and the test package with notes against each section. There are three utility packages used, and I list the code for one procedure from those, that does all of the assertion and reporting from a single call. After the code listings, I provide some design material on the code packages used.

Design Patterns for Database API Testing 2: Views 1 - Design presents another design pattern following the same ideas.

API Testing Antipatterns

Automated unit testing of database APIs is often considered difficult and time-consuming. Unfortunately I believe it is made much worse by widespread following of antipattern approaches, some of which appear on popular database testing-related sites as examples to follow.

Here are a few antipatterns that I have identified, and which my code attached below avoids.

Test this then test this then...

Antipattern description

This occurs when the unit test code is written as a long sequence of method calls, each followed by assertions, and each usually having hard-coded data values in both method calls and assertions. It is a classic antipattern because it is widespread and developers often follow it thinking they are doing the right thing.

Antipattern consequence

The testing code becomes very long-winded and hard to follow, and tends to result in less rigorous testing.

Pattern alternative

Store all input data in arrays at the start, loop over the input scenarios array, accumulating the outputs in arrays, and loop over the output arrays for the assertions.
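
A compact sketch of this data-driven shape (purely illustrative: Upper stands in for the unit under test, and the project's L1_chr_arr list type is assumed to exist):

DECLARE
  c_inp_lis   L1_chr_arr := L1_chr_arr ('input 1', 'input 2');
  c_exp_lis   L1_chr_arr := L1_chr_arr ('INPUT 1', 'INPUT 2');
  l_act_lis   L1_chr_arr := L1_chr_arr ();
BEGIN
  l_act_lis.EXTEND (c_inp_lis.COUNT);
  FOR i IN 1..c_inp_lis.COUNT LOOP
    l_act_lis(i) := Upper (c_inp_lis(i));   -- stand-in for the call to the unit under test
  END LOOP;
  FOR i IN 1..c_exp_lis.COUNT LOOP          -- all assertions in one loop at the end
    DBMS_OUTPUT.Put_Line ('Scenario ' || i || ': ' ||
        CASE WHEN l_act_lis(i) = c_exp_lis(i) THEN 'SUCCESS'
             ELSE 'FAILURE, actual = ' || l_act_lis(i) END);
  END LOOP;
END;
/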

Method-based testing

Antipattern description

This occurs when the test suite is based on testing all the methods in a package, rather than units of behaviour, such as a procedure serving a web service.

Antipattern consequence

In the video linked below, Ian Cooper explains very well the adverse consequences of this antipattern from a Java perspective; they apply equally to database testing:

  • It results in a great deal more testing code, with a lot of redundancy, which deters people from the whole concept of test-driven development
  • Re-factoring is more difficult because the unit test code tests the implementation
  • Shifting the focus of testing from the behavioural side is also unlikely to improve testing quality

TDD: Where Did It All Go Wrong?

Pattern alternative

Include in your test suite only tests of well-defined units of behaviour such as a web service entry point procedure, ignoring helper methods.

Field-level assertion

Antipattern description

This occurs when the individual fields written to the database or in output arrays have their own assertions.

Antipattern consequence

Real database applications often have tables with large numbers of fields, and the numbers of assertions can consequently become very large.

Pattern alternative

Assert at the record level by concatenating the fields in a record into one string.
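
In the code later in this article this is done by the Utils.List_Delim function. A minimal sketch of the idea (the name, signature and delimiter here are assumptions, not the project's actual implementation):

-- Sketch: concatenate a record's fields into one delimited string, so that a single
-- comparison covers the whole record
CREATE OR REPLACE FUNCTION record_string (p_fld_lis L1_chr_arr) RETURN VARCHAR2 IS
  l_str   VARCHAR2(4000);
BEGIN
  FOR i IN 1..p_fld_lis.COUNT LOOP
    l_str := l_str || CASE WHEN i > 1 THEN '|' END || p_fld_lis(i);
  END LOOP;
  RETURN l_str;
END record_string;
/

A single assertion per record then compares, for example, record_string (L1_chr_arr ('1513', 'LN 1', 'EM 1', 'IT_PROG', '1000')) against the expected delimited string, instead of five field-level assertions.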

Coupled tests

Antipattern description

This occurs when testing one scenario affects another; for example, when all the test data are created at once and not rolled back after each scenario and re-created as needed for the next.

This antipattern is strongly promoted by some popular testing frameworks, where package level setup and teardown are mandatory, at least by default. My own framework deliberately does not support these, preferring the test program to call its own private procedures as necessary at the appropriate level.

Antipattern consequence

The coding becomes more complex as it is necessary to disentangle the results of earlier test scenarios from those of the current scenario.

Pattern alternative

Set up test data at the scenario level, not at the procedure level, and roll it back at the end of the scenario testing.

Opaque output

Antipattern description

This occurs when the output does not show what was tested.

Antipattern consequence

It is harder to review the testing, especially when combined, as it usually is, with the Test this then test this then... antipattern. This results in lower quality.

Pattern alternative

Make the output self-documenting with clear scenario descriptions and listings of entities and fields that are being tested.

Emp_WS.AIP_Save_Emps - Base Procedure

PROCEDURE AIP_Save_Emps (p_emp_in_lis           ty_emp_in_arr,     -- list of employees to insert
                         x_emp_out_lis      OUT ty_emp_out_arr) IS -- list of employee results
 
  l_emp_out_lis        ty_emp_out_arr;
  bulk_errors          EXCEPTION;
  PRAGMA               EXCEPTION_INIT (bulk_errors, -24381);
  n_err PLS_INTEGER := 0;

BEGIN

  FORALL i IN 1..p_emp_in_lis.COUNT
    SAVE EXCEPTIONS
    INSERT INTO employees (
        employee_id,
        last_name,
        email,
        hire_date,
        job_id,
        salary
    ) VALUES (
        employees_seq.NEXTVAL,
        p_emp_in_lis(i).last_name,
        p_emp_in_lis(i).email,
        SYSDATE,
        p_emp_in_lis(i).job_id,
        p_emp_in_lis(i).salary
    )
    RETURNING ty_emp_out_obj (employee_id, To_Char(To_Date(employee_id,'J'),'JSP')) BULK COLLECT INTO x_emp_out_lis;

EXCEPTION
  WHEN bulk_errors THEN

    l_emp_out_lis := x_emp_out_lis;

    FOR i IN 1 .. sql%BULK_EXCEPTIONS.COUNT LOOP
      IF i > x_emp_out_lis.COUNT THEN
        x_emp_out_lis.Extend;
      END IF;
      x_emp_out_lis (SQL%Bulk_Exceptions (i).Error_Index) := ty_emp_out_obj (0, SQLERRM (- (SQL%Bulk_Exceptions (i).Error_Code)));
    END LOOP;

    FOR i IN 1..p_emp_in_lis.COUNT LOOP
      IF i > x_emp_out_lis.COUNT THEN
        x_emp_out_lis.Extend;
      END IF;
      IF x_emp_out_lis(i).employee_id = 0 THEN
        n_err := n_err + 1;
      ELSE
        x_emp_out_lis(i) := l_emp_out_lis(i - n_err);
      END IF;
    END LOOP;

END AIP_Save_Emps;

Notes on base procedure

  • There is a FORALL statement to insert a record into employees for each record in the input list in one batch
  • The RETURNING clause uses a BULK COLLECT to return the new id and its value in words to the output list
  • SAVE EXCEPTIONS means valid records will be inserted, while invalid ones will fall to the EXCEPTION block
  • Invalid records cause a zero and the error message to be interpolated into the output array
  • I tried to follow a similar principle here to that in Connor McDonald's article, Assume the Best; Plan for the Worst - Oracle Magazine Article

TT_Emp_WS.tt_AIP_Save_Emps - Testing Package Procedure

Declaration section

PROCEDURE tt_AIP_Save_Emps IS

  c_proc_name CONSTANT  VARCHAR2(61) := 'TT_Emp_WS.tt_AIP_Save_Emps';

  c_ln_prefix             CONSTANT VARCHAR2(20) := 'LN ';
  c_em_prefix             CONSTANT VARCHAR2(20) := 'EM ';

  c_ln                    CONSTANT L1_chr_arr := L1_chr_arr (
        c_ln_prefix || '1', c_ln_prefix || '2', c_ln_prefix || '3', c_ln_prefix || '4', c_ln_prefix || '5', c_ln_prefix || '6');
  c_em                    CONSTANT L1_chr_arr := L1_chr_arr (
        c_em_prefix || '1', c_em_prefix || '2', c_em_prefix || '3', c_em_prefix || '4', c_em_prefix || '5', c_em_prefix || '6');
  c_job_id                CONSTANT VARCHAR2(20) := 'IT_PROG';
  c_job_id_invalid        CONSTANT VARCHAR2(20) := 'NON_JOB';

  c_salary                CONSTANT L1_chr_arr := L1_chr_arr ('1000', '1500', '2000x', '3000', '4000', '5000');

  c_params_3lis           CONSTANT L3_chr_arr := L3_chr_arr (
          L2_chr_arr (L1_chr_arr (c_ln (1),    c_em (1),        c_job_id,         c_salary (1))), -- valid
          L2_chr_arr (L1_chr_arr (c_ln (2),    c_em (2),        c_job_id_invalid, c_salary (2))), -- invalid
          L2_chr_arr (L1_chr_arr (c_ln (3),    c_em (3),        c_job_id,         c_salary (3))), -- invalid salary, nan
          L2_chr_arr (L1_chr_arr (c_ln (4),    c_em (4),        c_job_id,         c_salary (4)),  -- valid
                      L1_chr_arr (c_ln (5),    c_em (5),        c_job_id_invalid, c_salary (5)),  -- invalid job id
                      L1_chr_arr (c_ln (6),    c_em (6),        c_job_id,         c_salary (6)))  -- valid
  );
  g_exp_3lis                   L3_chr_arr;

  c_ws_ms_limit          CONSTANT PLS_INTEGER := 2;
  c_scenario_lis         CONSTANT L1_chr_arr := L1_chr_arr (
                               '1 valid record',
                               '1 invalid job id',
                               '1 invalid number',
                               '2 valid records, 1 invalid job id (2 deliberate errors)'
  );
  c_inp_group_lis       CONSTANT L1_chr_arr := L1_chr_arr ('Employee');
  c_inp_field_2lis      CONSTANT L2_chr_arr := L2_chr_arr (
                                                        L1_chr_arr (
                                                                'Name',
                                                                'Email',
                                                                'Job',
                                                                '*Salary')
  );
  c_out_group_lis       CONSTANT L1_chr_arr := L1_chr_arr ('Employee', 'Output array', 'Exception');
  c_out_field_2lis      CONSTANT L2_chr_arr :=  L2_chr_arr (
                                    L1_chr_arr ('*Employee id', 'Name', 'Email', 'Job', '*Salary'),
                                    L1_chr_arr ('*Employee id', 'Description'),
                                    L1_chr_arr ('Error message')
  );
  l_timer_set           PLS_INTEGER;
  l_inp_3lis            L3_chr_arr := L3_chr_arr();

  l_act_3lis         L3_chr_arr := L3_chr_arr();

Notes on declaration section

  • Observe that the declarations of l_inp_3lis and g_exp_3lis as 3-level generic lists map directly to Scenario Inputs and Scenario Outputs in the array structure diagram in Design Patterns for Database API Testing 1: Web Service Saving 1 - Design
  • c_scenario_lis, as a generic list maps directly to Scenario Descriptions
  • Adding or subtracting fields in the input object means simply adding or subtracting fields in the inner level of the list
  • Adding test scenarios (i.e. calls) amounts to adding outer level records in each list
  • Using the program as a template for other unit test programs is therefore very easy
  • All inputs and outputs are printed in the output, making the testing self-documenting

Setup

  PROCEDURE Setup_Array IS
    l_last_seq_val         PLS_INTEGER;
  BEGIN

    SELECT employees_seq.NEXTVAL
      INTO l_last_seq_val
      FROM DUAL;

    g_exp_3lis := L3_chr_arr ( -- each call results in a list of 2 output lists: first is the table records; second is the out array
                        L2_chr_arr (L1_chr_arr (Utils.List_Delim (To_Char(l_last_seq_val+1), c_ln (1), c_em (1), c_job_id, c_salary (1))), -- valid char, num pair
                                    L1_chr_arr (Utils.List_Delim (To_Char(l_last_seq_val+1), To_Char(To_Date(l_last_seq_val+1,'J'),'JSP'))),
                                    Utils_TT.c_empty_list
                        ),
                        L2_chr_arr (Utils_TT.c_empty_list,
                                    L1_chr_arr (Utils.List_Delim (0, 'ORA-02291: integrity constraint (.) violated - parent key not found')),
                                    Utils_TT.c_empty_list
                        ),
                        L2_chr_arr (Utils_TT.c_empty_list,
                                    Utils_TT.c_empty_list,
                                    L1_chr_arr ('ORA-06502: PL/SQL: numeric or value error: character to number conversion error')
                        ),
                        L2_chr_arr (L1_chr_arr (Utils.List_Delim (To_Char(l_last_seq_val+3), c_ln (4), c_em (4), c_job_id, c_salary (1)), -- c_salary (1) should be c_salary (4)
                                                Utils.List_Delim (To_Char(l_last_seq_val+5), c_ln (6), c_em (6), c_job_id, c_salary (6)),
                                                Utils.List_Delim (To_Char(l_last_seq_val+5), c_ln (6), c_em (6), c_job_id, c_salary (6))), -- duplicate record to generate error
                                    L1_chr_arr (Utils.List_Delim (To_Char(l_last_seq_val+3), To_Char(To_Date(l_last_seq_val+3,'J'),'JSP')),
                                                Utils.List_Delim (0, 'ORA-02291: integrity constraint (.) violated - parent key not found'),
                                                Utils.List_Delim (To_Char(l_last_seq_val+5), To_Char(To_Date(l_last_seq_val+5,'J'),'JSP'))),
                                    Utils_TT.c_empty_list
                        )
                     );

    l_act_3lis.EXTEND (c_params_3lis.COUNT);
    l_inp_3lis.EXTEND (c_params_3lis.COUNT);

    FOR i IN 1..c_params_3lis.COUNT LOOP

      l_inp_3lis(i) := L2_chr_arr();
      l_inp_3lis(i).EXTEND(1);
      l_inp_3lis(i)(1) := L1_chr_arr();
      l_inp_3lis(i)(1).EXTEND(c_params_3lis(i).COUNT);
      FOR j IN 1..c_params_3lis(i).COUNT LOOP

        l_inp_3lis(i)(1)(j) := Utils.List_Delim (c_params_3lis(i)(j)(1), c_params_3lis(i)(j)(2), c_params_3lis(i)(j)(3), c_params_3lis(i)(j)(4));

      END LOOP;

    END LOOP;

  END Setup_Array;

Notes on Setup

  • Setup gets a value from the sequence in order to determine what values are expected in the outputs
  • Any concurrent calls could invalidate this expectation, but scheduling the test runs may avoid this; if not, one could change the expectation to a category, say a positive integer in a certain range
  • The example has two output groups, corresponding to a table and an output array
  • Observe how easy it would be to extend to extra tables or arrays, just by adding records in the middle level of the list
  • c_empty_list is a 1-record list containing a string 'EMPTY', which is used to simplify assertion of empty lists
  • Setup in this case has no need to create test data, as we have chosen to reference pre-existing departments
  • There is no teardown as the inserts from the procedure call are rolled back after each call
  • The call to Write_Times writes out the report from the package level timer set, including setup timings (if any)

'Pure' API wrapper procedure

  PROCEDURE Purely_Wrap_API (p_inp_2lis        L2_chr_arr,       -- input list of lists (record, field)
                             x_act_2lis    OUT L2_chr_arr) IS    -- output list of lists (group, record)

    l_emp_out_lis       emp_out_arr;
    l_tab_lis           L1_chr_arr;
    l_arr_lis           L1_chr_arr;
    l_err_lis           L1_chr_arr;

    -- Do_Save makes the ws call and returns o/p array
    PROCEDURE Do_Save (x_emp_out_lis OUT emp_out_arr) IS
      l_emp_in_lis        emp_in_arr := emp_in_arr();
    BEGIN

      FOR i IN 1..p_inp_2lis.COUNT LOOP
        l_emp_in_lis.EXTEND;
        l_emp_in_lis (l_emp_in_lis.COUNT) := emp_in_rec (p_inp_2lis(i)(1), p_inp_2lis(i)(2), p_inp_2lis(i)(3), p_inp_2lis(i)(4));
      END LOOP;

      Timer_Set.Init_Time (p_timer_set_ind => l_timer_set);
      Emp_WS.AIP_Save_Emps (
                p_emp_in_lis        => l_emp_in_lis,
                x_emp_out_lis       => x_emp_out_lis);
      Timer_Set.Increment_Time (p_timer_set_ind => l_timer_set, p_timer_name => Utils_TT.c_call_timer);

    END Do_Save;

    -- Get_Tab_Lis: gets the database records inserted into a generic list of strings
    PROCEDURE Get_Tab_Lis (x_tab_lis OUT L1_chr_arr) IS
    BEGIN

      SELECT Utils.List_Delim (employee_id, last_name, email, job_id, salary)
        BULK COLLECT INTO x_tab_lis
        FROM employees
       WHERE ttid = Utils.c_session_id_if_TT
       ORDER BY employee_id;
      Timer_Set.Increment_Time (p_timer_set_ind => l_timer_set, p_timer_name => 'SELECT');

    EXCEPTION
      WHEN NO_DATA_FOUND THEN NULL;
    END Get_Tab_Lis;

    -- Get_Arr_Lis converts the ws output array into a generic list of strings
    PROCEDURE Get_Arr_Lis (p_emp_out_lis emp_out_arr, x_arr_lis OUT L1_chr_arr) IS
    BEGIN

      IF p_emp_out_lis IS NOT NULL THEN

        x_arr_lis := L1_chr_arr();
        x_arr_lis.EXTEND (p_emp_out_lis.COUNT);
        FOR i IN 1..p_emp_out_lis.COUNT LOOP

          x_arr_lis (i) := Utils.List_Delim (p_emp_out_lis(i).employee_id, p_emp_out_lis(i).description);

        END LOOP;

      END IF;

    END Get_Arr_Lis;

  BEGIN

    BEGIN

      Do_Save (x_emp_out_lis => l_emp_out_lis);
      Get_Tab_Lis (x_tab_lis => l_tab_lis);
      Get_Arr_Lis (p_emp_out_lis => l_emp_out_lis, x_arr_lis => l_arr_lis);

    EXCEPTION
      WHEN OTHERS THEN
        l_err_lis := L1_chr_arr (SQLERRM);
    END;

    x_act_2lis := L2_chr_arr (Utils_TT.List_or_Empty (l_tab_lis), Utils_TT.List_or_Empty (l_arr_lis), Utils_TT.List_or_Empty (l_err_lis));
    ROLLBACK;

  END Purely_Wrap_API;

Notes on main testing procedure

  • The call converts the generic input data into the objects required by the base procedure, then calls the procedure and sets the resulting records for each output group
  • Purely_Wrap_API has a very simple main section that calls local procedures to split the logic into three sections
  • Its EXCEPTION clause captures the scenario where an invalid type conversion is attempted
  • Its output is a generic 2-level list that is mapped to a record in the outer-level output list
  • Empty lists are converted to c_empty_list, the 1-record list mentioned earlier
  • The procedure ends with a rollback

Main section

BEGIN
--
-- Every testing main section should be similar to this, with array setup, then loop over scenarios
-- making a 'pure'(-ish) call to specific, local Purely_Wrap_API, with single assertion call outside
-- the loop
--
  l_timer_set := Utils_TT.Init (c_proc_name);
  Setup_Array;
  Timer_Set.Increment_Time (l_timer_set, 'Setup_Array');

  FOR i IN 1..c_params_3lis.COUNT LOOP

    Purely_Wrap_API (c_params_3lis(i), l_act_3lis(i));

  END LOOP;

  Utils_TT.Is_Deeply (c_proc_name, c_scenario_lis, l_inp_3lis, l_act_3lis, g_exp_3lis, l_timer_set, c_ws_ms_limit,
                      c_inp_group_lis, c_inp_field_2lis, c_out_group_lis, c_out_field_2lis);

EXCEPTION
  WHEN OTHERS THEN
    Utils.Write_Other_Error;
    RAISE;
END tt_AIP_Save_Emps;

Notes on main section

  • The main section is very simple: a local procedure Purely_Wrap_API is called for each input scenario
  • The call converts the generic input data into the objects required by the base procedure, then calls the procedure and sets the resulting records for each output group
  • Is_Deeply is a generic utility procedure that does all the assertions and calls the results reporting utility
  • This can be used without change for any number of output groups with any numbers of fields, of any types

Utils_TT - API Test Utilities Package

We will include only one procedure from this package in the body of the article; see the GitHub project for the full code.

Is_Deeply - to check results from testing

PROCEDURE Is_Deeply (p_proc_name                 VARCHAR2,      -- calling procedure
                     p_test_lis                  L1_chr_arr,    -- test descriptions
                     p_inp_3lis                  L3_chr_arr,    -- input value strings
                     p_act_3lis                  L3_chr_arr,    -- actual result strings
                     p_exp_3lis                  L3_chr_arr,    -- expected result strings
                     p_timer_set                 PLS_INTEGER,   -- timer set index
                     p_ms_limit                  PLS_INTEGER,   -- call time limit in ms
                     p_inp_group_lis             L1_chr_arr,    -- input group names
                     p_inp_fields_2lis           L2_chr_arr,    -- input fields descriptions
                     p_out_group_lis             L1_chr_arr,    -- output group names
                     p_fields_2lis               L2_chr_arr) IS -- test fields descriptions

  l_num_fails_sce                L1_num_arr :=  L1_num_arr();
  l_num_tests_sce                L1_num_arr :=  L1_num_arr();
  l_tot_fails                    PLS_INTEGER := 0;
  l_tot_tests                    PLS_INTEGER := 0;

.
.
.
(private procedures - see GitHub project, https://github.com/BrenPatF/trapit_oracle_tester, for full code listings)
.
.
.
BEGIN

  Detail_Section (l_num_fails_sce, l_num_tests_sce);
  Summary_Section (l_num_fails_sce, l_num_tests_sce, l_tot_fails, l_tot_tests);
  Set_Global_Summary (l_tot_fails, l_tot_tests + 1);

END Is_Deeply;

Notes on Is_Deeply

  • This is the base version of Is_Deeply with 3-level arrays of expected and actuals
  • An inner loop asserts actual values (which are records) against expected
  • The final assertion is against average call time
  • It is expected that all assertion within a test procedure will be via a single call to one of the versions of this procedure, making a big reduction in code compared with traditional unit testing approaches
  • After final assertion a call is made to write out all the results, by scenario, with all inputs printed first, followed by actuals (and expected, where they differ); this means that the test outputs now become precise and accurate documents of what the program does

Package Usages Diagram

Call Structure Table - TT_Emp_WS.tt_AIP_Save_Emps
Installation Instructions

See TRAPIT - TRansactional API Testing in Oracle for a link to the installation code for both the framework and the demo code on GitHub, and links to related articles.

Exercise

If you are interested in following a similar approach to testing, you might find this exercise an easy way to get going: revise the base procedure to return error records as a second output array, with the first output array returning only valid records. You do not need to define any new types for this. Update the unit test code for the new signature.
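
A sketch of the kind of signature change the exercise implies, reusing the existing output type for the error list:

PROCEDURE AIP_Save_Emps (p_emp_in_lis       ty_emp_in_arr,      -- list of employees to insert
                         x_emp_out_lis  OUT ty_emp_out_arr,     -- valid records: new id plus description
                         x_emp_err_lis  OUT ty_emp_out_arr);    -- failed records: zero plus error message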

Conclusions






Data Modelling of XML SOAP Documents

I have been involved in a number of projects where web services are used for interfacing, and have generally found a high level of complexity in their use compared with traditional interfacing methods. One of the areas of complexity lies in converting between the XML messages and the fields and arrays used in conventional programming languages such as Oracle PL/SQL. Often this is handled in an unmodular way with request messages being manually built, and the response message being parsed by individual xpath searches for specific strings.

My approach is to handle these conversions in a generic layer that lies between the client applications and the low-level APIs provided by the programming language for HTTP calls and XML processing. This post deals with a data model and the data structures used in the PL/SQL package that I wrote for calling web services from PL/SQL. I have posted on the program itself here: A Layered Approach To Processing XML Web Services, and expect to post on examples of use at a later date.

Web Services
XML web services have become a standard mechanism for interfacing data between applications across the internet. The advantage that they have is that a standard transfer mechanism (HTTP, Hypertext Transfer Protocol) is used, and the data are formatted according to a standard specification, regardless of the technologies in which the applications are implemented. The data formats are described in a Web Services Description Language (WSDL) file, and interfacing is effected using SOAP (Simple Object Access Protocol) messages, as specified by the World Wide Web Consortium (W3C).

Web services form the cornerstone of Service Oriented Architecture (SOA), which Oracle uses in its Application Integration Architecture (AIA). A good acronym dictionary is vital for working in these areas 🙂

Interfacing by web services consists of sending input data as a text string in XML format by an HTTP request, and receiving an HTTP response, also as a text string in XML format.

My Web Service Interface Program
The layer package is intended to handle any data structures that can be represented in XML. On the request side, the client application will call layer APIs to add records and elements to build the request structure without having to write any XML directly, and the layer will construct the XML SOAP message and call the web service. The response side relies on a generic data structure comprising two hierarchical arrays: a structure array that specifies the group structure of the response XML message (or a subset of it), and a data array that holds the data with pointers to the structure array. The structure array forms an input, and is used by the layer to call standard XML APIs (Oracle XML APIs in my implementation) to retrieve the data from the response. The data modelling and conceptual framework are not language-specific, while my implementation is in Oracle PL/SQL.

SOAP Data Model
Both input and output of a web service call include an XML SOAP message, which is a text string consisting of the data in the form of an XML hierarchy. The elements in the hierarchy contain a mandatory name field, plus optional namespace and value, optional list of child elements, and an optional list of attributes. The hierarchy must contain certain standard elements and attributes, within a structure shown in this skeleton SOAP message (modified from an example in the SOAP Tutorial/SOAP Syntax by W3Schools.com):
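
A minimal skeleton of this kind, with SOAP 1.1 namespaces assumed and ellipses marking where application-specific content goes, looks like:

<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               soap:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">
  <soap:Header>
    ...
  </soap:Header>
  <soap:Body>
    ...
    <soap:Fault>
      ...
    </soap:Fault>
  </soap:Body>
</soap:Envelope>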

In addition to the standard elements and attributes, there may be application-specific elements, attributes and values, as indicated by the ellipses. The hierarchy of elements can be represented as below, where the group entity on the left represents the fact that a number of elements at a given level may be implicitly grouped, although without this grouping being explicit in the SOAP document. We use this grouping in the data structures on the response side but not on the request side:


Each element in an XML document has a name, an optional value, optional attribute name/value pairs, with the names being optionally name space-qualified, and the element may contain child elements. Please refer to widely available documentation on the SOAP protocol for further information; here's a link I found useful: A Busy Developer's Guide to SOAP 1.1

Request Side Data Structures

The diagram shows the main data structures that we use for building the request. Boxes with double borders represent arrays, and a solid arrow represents a pointer from a field to an array element.

The XML Tree List structure is derived from the general model above by treating the list of attributes as a string to be included in the element entity, and corresponds to the array specified by data type xml_tree_list_type in the following table. This array is built from procedure calls by the client application, and is processed internally by a recursive procedure to construct the SOAP request message. Note that as well as the XML tree types, we have defined two additional list structures for convenience of parameter passing:

  • Name Space List for passing name spaces
  • XML Field List as it is often convenient to treat a group of fields as a record, consisting of a parent element and a list of child fields

Name Space List Data Types

Type Name             Element            Category          Description
--------------------  -----------------  ----------------  --------------------------
name_space_type                          Object            Name space
                      ns_label           Character         Name space label
                      ns_address         Character         Name space address
name_space_list_type                     Array             List of name spaces
                      (unnamed element)  name_space_type*  Name space

Field List Data Types

Type Name        Element            Category     Description
---------------  -----------------  -----------  ------------------------
field_type                          Object       XML field
                 field_name         Character    Field name
                 field_value        Character    Field value (optional)
field_list_type                     Array        List of XML fields
                 (unnamed element)  field_type*  XML field

XML Tree List Data Types

Type Name           Element            Category        Description
------------------  -----------------  --------------  -----------------------------------------------------------
xml_tree_type                          Object          Record in the XML tree holding the data for an XML element
                    parent_id          Integer         Index to parent record in the XML tree
                    ns_label           Character       Name space label (optional)
                    field_name         Character       Field name
                    field_value        Character       Field value (optional)
                    attr_string        Character       Attribute string, including name and value (optional)
xml_tree_list_type                     Array           XML tree
                    (unnamed element)  xml_tree_type*  Record in the XML tree
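
In PL/SQL these correspond to collection types along the following lines (a sketch only: the names follow the table, but the lengths, the record-versus-object choice and the containing package are assumptions):

CREATE OR REPLACE PACKAGE xml_types_sketch IS   -- hypothetical container for the sketch
  TYPE xml_tree_type IS RECORD (
    parent_id     PLS_INTEGER,      -- index of the parent record in the tree
    ns_label      VARCHAR2(30),     -- name space label (optional)
    field_name    VARCHAR2(100),    -- element name
    field_value   VARCHAR2(4000),   -- element value (optional)
    attr_string   VARCHAR2(4000)    -- attribute string, name and value (optional)
  );
  TYPE xml_tree_list_type IS TABLE OF xml_tree_type;
END xml_types_sketch;
/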

Response Side Data Structures

The diagram shows the main data structures that we use for processing the response. The broken arrow between the two arrays signifies that each nested child list in the data tree corresponds to a child group record in the structure tree.

The Group Structure Tree List array defines the group structure of the output data structure. To simplify input, this is first set up, by procedure calls from the client application, in an unnested structure in which each record points to its parent. The program then converts this into the nested structure shown below it, where the children are included in the parent record, which is more suitable for the later processing. Note that the hierarchy may be, and usually is, a subset of the actual SOAP response, since typically a SOAP response message contains layers that are not useful for the client application. Also note that if the web service returns a standard error response, the program replaces the group structure with that of the error response.

  • Each of the arrays (at the top level) has a root element without a parent, and these records have null values other than for their respective child lists
  • The model can represent any number of hierarchy levels

Group Structure Data Types

Type Name                 Element            Category              Description
------------------------  -----------------  --------------------  -----------------------------------------------------------------------------------------------
int_list_type                                Array                 List of integers
                          (unnamed element)  Integer               Integer
structure_tree_type                          Object                Structure tree record
                          group_name         Character             The group tag, used as a search string
                          ns_prefix          Character             Name space prefix
                          attr_string        Character             An attribute string that has to be included in searches where the group name is ambiguous (Oracle JDeveloper uses array as a group tag for arrays, with an attribute to differentiate)
                          child_list         int_list_type*
structure_tree_list_type                     Array                 Structure tree list
                          (unnamed element)  structure_tree_type*  Structure tree record

Data Structure Data Types

Type Name              Element            Category                Description
---------------------  -----------------  ----------------------  -----------------------------------------------------------------------------------
child_group_list_type                     Array                   List of indexes in the data tree list of child groups of current data tree record
                       (unnamed element)  int_list_type*          List of indexes in the data tree list of child records of current group
data_tree_type                            Object                  Data tree record
                       field_list         field_list_type*        Field list (specified in input side)
                       child_group_list   child_group_list_type*  List of indexes in the data tree list of child groups of current data tree record
data_tree_list_type                       Array                   Data tree list
                       (unnamed element)  data_tree_type*         Data tree record

Generic Data Models
In database design it is well known that overly generic data models lead to poor performance and extra application complexity. The web service interfacing model is of course highly generic, and it should be noted that the same problems may indeed offset the acknowledged standardisation advantages. The data model and structures described here necessarily reflect the genericity of the underlying architecture, while the approach taken is intended to reduce application complexity by moving much of it into a callable module.