Automation Framework Maintenance and Enhancement Using Selenium RC

CLIENT PROFILE

Client is a community service organization that has been transforming the lives of Australians in need for more than 150 years. Client provides:

  • Accommodation and support programs for the homeless
  • Creative services for homeless youth
  • Counseling and support programs for families
  • Accommodation and support for people with disabilities

With over 3,500 employees, volunteers, Board members, and supporters, the organization advocates for disadvantaged Australians and helps them get back on their feet. Working alongside government, corporate Australia, churches, and the wider community, the client is determined to overcome disadvantage across the nation.

BUSINESS SOLUTION

The client had a secure, web-based client record management system for creating and maintaining electronic records. The application used advanced Ajax controls, with Selenium RC for automation testing. Scripts were written in Java using Eclipse and executed via an existing Excel-driven framework.

Challenges faced:

  • Incremental feature updates changed workflows, causing existing scripts to fail
  • The framework produced memory errors when executing scripts in batch mode
  • Pop-ups (single and multiple) were not handled by Selenium, requiring additional handling
  • New features required framework updates and new script development

The client wanted:

  • Existing failing scripts fixed
  • Pop-up handling built into the framework
  • New scripts developed for added functionality

SOLUTION

Adactin proposed and implemented a gradual update process, divided into 5 phases:

Phase 1 – Analysis of Memory Issues in Framework

  • Identified root cause: 32-bit Eclipse had limited Java heap size
  • Migrated to 64-bit Eclipse and OS, resolving memory issues
  • Set up 64-bit automation framework system:
    • Installed Eclipse
    • Set up Selenium RC
    • Added external libraries and JARs
    • Installed Java and AutoIt

Phase 2 – Analysis of Existing Failing Scripts

  • Reviewed application functionalities
  • Set up test environment and automation framework
  • Executed all existing scripts
  • Logged and categorized issues by complexity and resolution time
  • Identified Java code issues where workflow-related methods were not executing correctly

Phase 3 – Fixing of Failed Scripts

  • Environment setup completed
  • Addressed issues:
    • Entered data in TestData.xls for new functionality
    • Added new keywords to framework and scripts
    • Debugged and fixed Java code and variables
    • Entered and corrected missing/incorrect XPaths
  • Implemented Pop-up Handling:
    • Analyzed workflows triggering pop-ups
    • Created pop-up-specific AutoIt EXE
    • Integrated AutoIt EXE calls into the scripts (see the sketch after this list)
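
A minimal sketch of that integration, assuming a compiled AutoIt script that waits for the pop-up window and dismisses it. The class, method, EXE name, and path below are illustrative placeholders, not the client's actual artifacts:

// Hypothetical helper invoked from a Java/Selenium RC test right after the
// action that triggers a native pop-up. The compiled AutoIt EXE is expected
// to wait for the pop-up window and click it closed.
public class PopupHandler {

    public static void handlePopup(String exePath) {
        try {
            Process process = new ProcessBuilder(exePath).start();
            process.waitFor(); // block until the pop-up has been dismissed
        } catch (Exception e) {
            throw new RuntimeException("Pop-up handling failed: " + exePath, e);
        }
    }
}

// Usage inside a test script (locator and path are placeholders):
// selenium.click("btnPrint");
// PopupHandler.handlePopup("C:\\automation\\autoit\\ClosePrintDialog.exe");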

Phase 4 – Writing New Scripts

  • Created new scripts for added functionality (a keyword-driven sketch follows this list):
    • Defined test steps in Framework.xls
    • Added required data in TestData.xls
    • Entered XPaths for new fields
    • Developed specific Java files in Eclipse
    • Executed all new scripts
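
A hedged sketch of the Excel-driven pattern these steps describe: a dispatcher reads keyword rows from Framework.xls with Apache POI and maps them onto Selenium RC calls. The column layout (keyword, locator, data), sheet index, keyword names, and URL are assumptions for illustration only:

import java.io.File;

import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

import com.thoughtworks.selenium.DefaultSelenium;

public class KeywordRunner {
    public static void main(String[] args) throws Exception {
        DefaultSelenium selenium =
                new DefaultSelenium("localhost", 4444, "*firefox", "http://app-under-test/");
        selenium.start();

        // Assumes the first sheet holds "keyword | locator | data" rows,
        // with no header row and all three columns populated.
        try (Workbook wb = WorkbookFactory.create(new File("Framework.xls"))) {
            Sheet steps = wb.getSheetAt(0);
            for (Row row : steps) {
                String keyword = row.getCell(0).getStringCellValue();
                String locator = row.getCell(1).getStringCellValue();
                String data    = row.getCell(2).getStringCellValue();

                switch (keyword) {
                    case "type":
                        selenium.type(locator, data);
                        break;
                    case "click":
                        selenium.click(locator);
                        selenium.waitForPageToLoad("30000");
                        break;
                    default:
                        throw new IllegalArgumentException("Unknown keyword: " + keyword);
                }
            }
        }
        selenium.stop();
    }
}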

Phase 5 – Knowledge Transfer

  • Created comprehensive update and fix reports
  • Delivered knowledge transfer sessions covering framework enhancements
  • Trained client’s internal automation analysts

BENEFITS

  • 64-bit setup enabled faster execution; scripts now run overnight in batch mode
  • Automation regression testing saved 40% of testing time
  • Pop-ups successfully handled using AutoIt
  • Reduced script execution time led to increased productivity
  • Provided debug and enhancement documentation, aiding long-term maintenance of the framework

TECHNOLOGY STACK

  • OS – Windows 7
  • Database – SQL Server 2000
  • Programming Language – Java
  • Automation Tool – Selenium RC
  • Pop-up Handling – AutoIt

ASSISTANCE PROVIDED BY CLIENT RESOURCES

  • IT team assisted in setting up the 64-bit machine
  • Testing team helped with setting up the environment
  • Testing team provided test data and scripts

ETL/Data Warehouse Testing of a GIS Spatial Application

CLIENT PROFILE

Client is a reputable organization responsible for various planning and environmental matters. The client uses geospatial technology in the delivery of planning and development services, such as the online lodgment and tracking of applications, viewing planning information on a web-based interactive map, and providing new ways for stakeholders to engage with the planning process.

The tools and services developed by the client help businesses and communities access and transact with planning services from anywhere, anytime. The client deals with a massive volume of historical and current data from multiple sources and departments, each with their own data standards. To integrate these, an ETL Tool (FME Workbench 2015) was used to enable flawless data transformation across departments. Data is uploaded to Google Map Engine (GME) and Microsoft Azure for visualization on the web application.

BUSINESS SITUATION

This GIS Spatial Web-based Application uses Google Maps Engine to help users quickly navigate the planning rules that apply to individual land parcels, or search for properties that match certain planning controls.

It has been developed primarily for councils and professionals such as architects, builders, certifiers, developers, and planners. This is a cloud-based product developed on Azure and GME (Google Map Engine) cloud platforms.

As the client deals with large volumes of everyday and historical data from multiple sources and departments, each with their own data standards, ETL/Data Warehouse and Security Testing was performed by Adactin to ensure smooth Data Extraction, Transformation, and Loading (ETL) into the two clouds (GME and Azure) without any data loss or truncation, using an ETL tool as an integrator.

This data upload occurs on a monthly basis for the Web Application, ensuring that users of the GIS Spatial Web Application always view the latest data.

The Web Application supports various functionalities including:

  • Basic Search
  • Advanced Search
  • 46 different layers spanning the whole of NSW
  • Many more features

The portal also provides links to Environmental Planning Instruments – including LEPs or State planning policies – hosted on a government website, which remains the authoritative source.

TECHNICAL SITUATION

As the web application was built on a cloud platform, with data arriving daily from multiple sources and departments in multiple formats, various ETLs with complex logic ran at different stages to standardize all of the data into one format before uploading it to the Google Map Engine and Microsoft Azure clouds.

Adactin provided a solution to verify the data from multiple sources: rectifying inconsistent formats, finding incompatible and duplicate entries, detecting data lost while the ETL process was running, and identifying wrongly entered data. The team verified through FME Workbench that the ETL logic worked correctly and that data from all sources was transformed into a single standardized, normalized dataset without dropped records, data loss, truncation, or introduced null records.
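
As one hedged illustration of these checks, the sketch below reconciles source and target row counts over JDBC and looks for null or duplicate business keys. The connection strings, credentials, table names, and asset_id column are assumptions, not the client's actual schema:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class EtlReconciliation {
    public static void main(String[] args) throws SQLException {
        try (Connection src = DriverManager.getConnection(
                     "jdbc:sqlserver://source-host;databaseName=staging", "user", "pwd");
             Connection tgt = DriverManager.getConnection(
                     "jdbc:sqlserver://target-host;databaseName=standardised", "user", "pwd")) {

            // Completeness: same number of rows before and after transformation.
            long srcCount = scalar(src, "SELECT COUNT(*) FROM land_parcels");
            long tgtCount = scalar(tgt, "SELECT COUNT(*) FROM land_parcels_std");
            System.out.printf("Row counts: source=%d target=%d %s%n",
                    srcCount, tgtCount, srcCount == tgtCount ? "OK" : "POSSIBLE DATA LOSS");

            // Integrity: no introduced nulls, no duplicate business keys.
            long nulls = scalar(tgt,
                    "SELECT COUNT(*) FROM land_parcels_std WHERE asset_id IS NULL");
            long dupes = scalar(tgt,
                    "SELECT COUNT(*) FROM (SELECT asset_id FROM land_parcels_std "
                  + "GROUP BY asset_id HAVING COUNT(*) > 1) d");
            System.out.printf("Null asset IDs=%d, duplicate asset IDs=%d%n", nulls, dupes);
        }
    }

    private static long scalar(Connection c, String sql) throws SQLException {
        try (Statement st = c.createStatement(); ResultSet rs = st.executeQuery(sql)) {
            rs.next();
            return rs.getLong(1);
        }
    }
}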

Once all the data was in one standardized format, the client transferred it to the two clouds (GME and Azure) via FME Workbench 2015. Here again, Adactin verified that the ETL logic worked correctly and transferred the full dataset to both clouds according to the various requirements and rules, without data loss or truncation, and without altering data formats or disturbing the schemas.

The client used multiple accounts to access the various tools and clouds, so Adactin also performed Security Testing to confirm that only authorized accounts could access the services permitted to them, and nothing more. Because data in the two clouds is interrelated, a unique Asset ID was assigned so that both clouds represent the data correctly in the client's web-based solution, which provides the various planning services for NSW council data. Adactin also performed a Production Verification Test.

Various types of testing were conducted on this product, as described below.

SOLUTION

Adactin proposed and implemented a test process which was divided into 5 Phases for ETL/Data Warehouse Testing of a GIS Web-based Application.

Phase-1 – Test Plan

  • Liaison and consultation with the client's test team to gather complete ETL and functional requirements
  • Planning for data management and transformation testing
  • Time spent with SMEs to understand the end-to-end business processes and business needs for the ETLs
  • Creation of high-level test scenarios

Phase-2 – ETL Test Cases Design/Enhance

  • Design of detailed test cases for system and integration ETL testing for the data transformation
  • Tools and software setup for ETL data transformation and migration testing
  • Test data setup

Phase-3 – Execution Cycle-1 – ETL/Data Warehouse Testing

  • Tools and software setup
  • Test data setup in the test environment
  • Execution of ETL test cases stage-wise as described below
  • Logging of bugs in the bug management tool
  • Publishing of test results

Stage 1 (Business Rules) – Testing to verify that ETL data transformation adhered to all specified requirements and business rules.
Stage 2 (BAU_DATA) – Testing of extracted data from multiple BAU sources. Data cleaning testing was performed.

Testing Type: Data Transformation Testing
Test Description:
This testing focused mainly on data that was externally injected through Python scripts once the data was in one standardized format. Python code was implemented to extract the data and convert it into the desired format.
The testing team was involved in verifying the Python code logic and workflows. The client pulled data from multiple BAU sources, applied ETL logic, converted it into the target database, and uploaded the data into GME and Azure clouds.
The testing team validated FME Workbench ETL logic, database accuracy, and data upload activities.

Testing Type: Security Testing
Security testing was done to ensure that different user accounts had the correct rights/permissions in the respective environments.
Negative testing was conducted to confirm that accounts did not have unauthorized access; this was executed across multiple environments (a minimal sketch follows).
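
A minimal sketch of one such positive/negative permission check over HTTP basic authentication. The endpoint, accounts, and expected responses are illustrative placeholders; the real checks were run account by account in each environment:

import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class AccessCheck {

    static int statusFor(String user, String pwd) throws Exception {
        URL url = new URL("https://test-env.example/api/planning-layers"); // placeholder
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String token = Base64.getEncoder()
                .encodeToString((user + ":" + pwd).getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + token);
        return conn.getResponseCode();
    }

    public static void main(String[] args) throws Exception {
        // Positive check: an authorised account should reach the service (HTTP 200).
        System.out.println("authorised   -> HTTP " + statusFor("authorised_user", "secret"));
        // Negative check: an unauthorised account should be rejected (HTTP 401/403).
        System.out.println("unauthorised -> HTTP " + statusFor("unauthorised_user", "secret"));
    }
}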

Testing Type: Cross-Browser and Functional Testing
This product was developed for both community and business use. The testing team validated compatibility across various operating systems and browsers.

Stage 3 (ETL_INTERNAL) – Verified that inconsistent data formats from multiple sources were standardized and transformed without any data loss or truncation.
Stage 4 (R1STD_DATA) – Confirmed complete data transformation from multiple BAU sources into a standardized format without data loss or truncation.
Stage 5 (PCO_Application Injection) – Verified SQL-based injection of external PCO links into the web application.
Stage 6 (ETL_EXTERNAL) – Ensured standardized data was uploaded to GME and Azure without data loss or schema issues.
Stage 7 (GME_DATA) – Validated successful data migration to GME Cloud via FME Workbench 2015.
Stage 8 (AZURE_DATA) – Validated successful data migration to Azure Cloud via FME Workbench 2015.
Stage 9 (SECURITY TESTING) – Confirmed that authorized accounts had proper access, and unauthorized users were restricted.
Stage 10 (PRODUCTION VERIFICATION TESTING) – Compared standardized source data with GME and Azure data using FME Workbench 2015.

Two data paths tested:

  • Standardised Source Data → ETL TOOL (FME Desktop) → GME Cloud
  • Standardised Source Data → ETL TOOL (FME Desktop) → Azure Cloud

Phase-4 – Regression Testing Cycle for the Front-End GIS Web-Based Application

  • Test data setup in the test environment
  • Execution of regression test cases for the application
  • Cross-browser and OS/device compatibility testing by parallel teams
  • Publishing of test results

Phase-5 – Security Testing (Non-Functional Testing)

  • Setup of test and production environments and user configurations
  • Design and execution of test scenarios
  • Publishing of test results

BENEFITS

The technical solution proposed to the client delivered the following benefits:

  • Team logged more than 300 issues as part of testing phases, leading to improved quality of the application.
  • All key business rules were thoroughly tested within the limited timeframe to ensure correct functioning. No production issues were recorded in those areas.
  • Suggestions were made to the business team to improve the flow of business processes, resulting in a better application that uses clean, up-to-date NSW data for all upcoming monthly data uploads.
  • Suggestions to improve the infrastructure of the application to avoid downtime, e.g.,
    • Traffic Manager and File Configuration Server should have at least 2 instances
    • Data types and lengths should be consistent across different environments
  • An effective bug tracking process ensured that open bugs could be easily tracked and resolved.

TECHNOLOGY STACK

  • Microsoft Excel – Test case creation and execution
  • SQL Developer – Building and execution of SQL queries
  • SSMS (SQL Server Management Studio) 2014 – Data verification and validation
  • GME (Google Map Engine) – Spatial data verification and validation
  • ArcGIS Server 10.2, ArcCatalog – Data verification and validation
  • FME Workbench 2015 – Data validation and ETL logic testing
  • Mantis – Defect management and reporting

ASSISTANCE PROVIDED BY CLIENT RESOURCES

  • Assistance provided by client SMEs in understanding business requirements and rules for all ETL logic
  • Client SMEs conducted knowledge sharing sessions on different modules of data transformation
  • Client’s IT team helped in accessing tools like FME, GME, Azure, SSMS 2014, ArcGIS 10.2, and cloud environments
  • Client’s development team assisted in bug fixing
  • Client’s Project Management and Test Management teams supported test coordination with business users and development team

WEB CONTENT ACCESSIBILITY GUIDELINES (WCAG) 2.0 AA Testing for State Government Department

BUSINESS SITUATION

Client is a major State Government Department that launched a notification portal. Per legislation, the portal had to meet WCAG 2.0 AA compliance, and multiple device and OS combinations had to be tested.

SOLUTION

What Was Developed: Accessibility Testing Approach

  1. Use automated tools that reference WCAG 2.0 AA standards (an example check follows this list)
  2. Manually check results of automated tools
  3. Manual tests for iOS and Android accessibility compliance standards
  4. On-Device Assistive Technology – To ensure real-world accessibility, we included an assessment of the app using built-in assistive technology (VoiceOver, TalkBack, Zoom) to identify practical accessibility issues
  5. Test with a screen reader
  6. Testing across various browser combinations
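
As a hedged example of step 1, the sketch below uses Selenium WebDriver to flag images that lack a text alternative (WCAG success criterion 1.1.1). The portal URL is a placeholder, and a single-criterion check like this supplements rather than replaces the tools listed next:

import java.util.List;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;

public class AltTextCheck {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://portal.example.gov.au/"); // placeholder URL
            List<WebElement> images = driver.findElements(By.tagName("img"));
            for (WebElement img : images) {
                String alt = img.getAttribute("alt");
                if (alt == null || alt.trim().isEmpty()) {
                    System.out.println("Missing alt text: " + img.getAttribute("src"));
                }
            }
        } finally {
            driver.quit();
        }
    }
}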

KEY TOOLS USED

  • Code inspection and validation tools – Tenon, A-Checker, WAVE
  • Screen reader software – JAWS Screenreader and NVDA
  • Browser tools – IE WAT and Web Developer Toolbar (IE, Firefox, and Chrome)
  • iOS – iOS Accessibility Scanner, VoiceOver for iOS, and iOS Color Contrast Checker
  • Android – Accessibility Scanner, TalkBack for Android

BENEFITS

Key findings and outcomes:

  • 15 accessibility issues found during testing
  • Client improved its compliance from an initial 85% to 98% after fixes
  • Provided accessibility standards training to the client development team
  • Reviewed and provided recommendations on client user interface design

TECHNOLOGY STACK

  • ASP.Net
  • AWS
  • WebAPIs
  • SQL Server
  • AWS Lambda
  • Docker
  • Node.js
  • GitLab

RPA – Medical Billing and Coding Process

CHALLENGE

The client has a large staff of doctors, nurses, and medical assistants dedicated to wound care services. They provide services to patients in their homes and to residents in long-term, subacute, and supportive living facilities. They serve approximately 100 such facilities throughout Australia and treat 1,500 patients weekly.

Fast and accurate payment by Medicare is an important process for all healthcare providers. Manually analyzing clinical documents (charts and notes) and determining the codes applicable to a particular case was tedious and time-consuming, and otherwise required a dedicated, experienced coder.

More than 50,000 codes across different categories are used to generate invoices daily. The complexity of these codes, and the resulting inaccuracies in medical coding, delayed overall operations and led to improper payments.

The client wanted to eliminate the potential loss of revenue from inaccurate coding. Moreover, from a revenue cycle management perspective, it was crucial that the coding for billing was done quickly and with 100% accuracy to avoid rejection or non-payment of claims.

APPROACH

Adactin implemented RPA bots to process hundreds of charts each week easily and efficiently. With native machine learning capabilities, the bots understand and analyse wound and prescription documentation for each patient and generate a billing code that produces a suitable invoice (a purely illustrative sketch follows).
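
The bots themselves relied on machine learning; purely as an assumption-laden illustration of the final code-determination step, the sketch below shows a rule-based stand-in with invented keywords and billing codes:

import java.util.LinkedHashMap;
import java.util.Map;

public class CodeSuggester {

    // Invented keyword-to-code rules; the real determination used ML over charts.
    private static final Map<String, String> RULES = new LinkedHashMap<>();
    static {
        RULES.put("pressure ulcer stage 2", "CODE-PU2");      // hypothetical code
        RULES.put("surgical wound debridement", "CODE-SWD");  // hypothetical code
    }

    static String suggest(String chartText) {
        String text = chartText.toLowerCase();
        for (Map.Entry<String, String> rule : RULES.entrySet()) {
            if (text.contains(rule.getKey())) {
                return rule.getValue();
            }
        }
        return "REVIEW_BY_CODER"; // route ambiguous cases to a human coder
    }

    public static void main(String[] args) {
        System.out.println(suggest("Patient presents with pressure ulcer stage 2."));
    }
}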

The billing/payment process is handled by an external billing company, a dedicated system, or in-house staff. The automation reduces the risk of manual errors and improves the efficiency of the entire process.

As a result, the medical staff can focus more on patient care and more complex coding scenarios.

OUTCOMES & BENEFITS

  1. Manual Work Reduced by 95%
  2. Coding Accuracy Improved to 90%
  3. Shift Time Process Reduced by 80%

Implementation of NPP for On-Us and Off-Us Transactions for One of the Big Four Banks

CLIENT PROFILE

Client is an Australian bank and financial-services provider. It is one of Australia’s “big four” banks.

As of March 2018, the client has 14 million customers and employs almost 40,000 people, with a vision to be one of the world’s great service companies, helping customers, communities, and people to prosper and grow.

Client provides a broad range of consumer, business, and institutional banking, and wealth management services through a portfolio of financial services brands and businesses.

NPP OVERVIEW

The New Payments Platform is a world-class payments infrastructure for the Australian economy. It gives consumers, businesses, and government departments a platform to make fast, versatile, and data-rich payments to meet the evolving needs of a 24/7 digital economy.

It’s a platform that enables real-time clearing and settlement for simple or complex payment solutions, between two people or between many. It can simplify payments through an Addressing Service, called PayID, as well as offer the ability to include more information with payments, such as text or links to externally hosted documents.

While the platform has been collaboratively developed by NPP Australia Ltd and 13 financial institutions, a large number of additional financial institutions connect to the Platform through one of these initial participants.

The New Payments Platform enables innovators to develop and offer an overlay service or product, helping more consumers and businesses to realize the benefits of faster, data-rich payments and the PayID simple addressing service.

The first innovative product delivered via the New Payments Platform is Osko.

BUSINESS SITUATION

Adactin was involved in the customization of a popular payments-industry product for initiating payments, developed in an Agile (Scrum) environment with a three-week sprint cycle. The objective was to implement real-time payments (also known as Immediate Payments) on top of the existing system, incorporating the requirements and business needs related to NPP/Osko payments for the Retail and Corporate channels, covering both on-us and off-us transactions.
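
For illustration only, the sketch below posts a SOAP payment-initiation request over HTTP. The endpoint, operation name, and message fields are placeholders loosely modelled on ISO 20022-style content and PayID addressing, not the actual interface of the product that was customized:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class PaymentInitiation {
    public static void main(String[] args) throws Exception {
        String soap =
                "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
              + "<soapenv:Body>"
              + "<InitiatePayment>"                      // hypothetical operation
              + "<PayId>someone@example.com</PayId>"     // PayID-style addressing
              + "<Amount currency=\"AUD\">100.00</Amount>"
              + "<Remittance>Invoice 42</Remittance>"
              + "</InitiatePayment>"
              + "</soapenv:Body>"
              + "</soapenv:Envelope>";

        HttpURLConnection conn = (HttpURLConnection)
                new URL("https://payments-gateway.example/initiate").openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(soap.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}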

SOLUTION

Adactin proposed and implemented solutions for NPP/Osko payments for the end client, covering both the Retail and Corporate channels.

Phase 1 – Business Interaction and Requirement Analysis

  • The Adactin team, comprising developers, spent time with Business Analysts (BAs) and Subject Matter Experts (SMEs) to understand the system, the end-to-end business processes, and the requirements of the application.
  • Coordinated and collaborated with cross-functional business users, engineers, and Business Analysts to discuss the design and requirements and obtain approval for an elegant solution.

Phase 2 – Development of the Requirements

  • Demonstrated a proof of concept covering one key scenario.
  • Worked on enhancements and defect fixing for the recently launched NPP (Osko Payments).
  • Worked on designing and developing solutions related to Osko Payments for the Corporate Online channel for on-us and off-us transactions.
  • Researched and resolved reported system problems efficiently and accurately while adhering to internal software management standards and procedures.
  • Provided ongoing maintenance support and enhancement in the existing system and platform.

Phase 3 – Unit Testing of the Developed Modules

  • Tested on-us and off-us transactions using the Asset Simulator to verify transactions.
  • Liaised with the testing team, infrastructure team, and Technical Architect for the fixes done and analysis of the results with the changes in configuration.

Phase 4 – Documentation and Handover

  • Versioned the scripts in SVN and handed over the documents and reports in a shared repository.
  • Prepared Release Notes and Traceability Matrix summarizing the configurations done and files modified.

BENEFITS

The technical solution proposed to the client delivered the following benefits:

  • Opportunity to reduce costs through an aggregated model, offering both Single Credit Transfers (SCT) and BPAY’s Osko Overlay Service 1 (Payments) at industry launch
  • PayID registrations and updates
  • Real-time payments clearing and settlement processing
  • Industry-standard gateway solution from a global financial services vendor
  • The solution enables banks to thrive with the latest APIs based on the ISO 20022 messaging standard and instant payments processing
  • Fully hosted case management solution with operational and technical support available
  • Flexible solutions to connect, create, and control the customer user experience

TECHNOLOGY STACK

  • OS – Windows 7
  • Database – MySQL
  • Language – Groovy
  • Webservice – SOAP
  • Tools – Configuration Builder, Platform Manager, Asset, Jira, Confluence, and SVN

ASSISTANCE PROVIDED BY CLIENT RESOURCES

  • Assistance provided by client’s cross-functional team in understanding the existing system and requirements.
  • Knowledge transfer on application workflows

Performance Testing for a State Government Department Client

PROBLEM STATEMENT

Client is a major State Government Department that planned to transition its existing SAP instance database from DB2 to SAP HANA. As part of this transition, it was essential to test the performance of the system to ensure that non-functional requirements were met and the system was stable under peak transaction and user load.

OBJECTIVES

Objectives of this Performance Testing exercise include:

  • Determine whether the to-be SAP system is capable of supporting the target number of concurrent users (400) on the SAP FIORI portal while retaining its functional stability and responding within pre-defined guidelines.
  • Determine whether the system is capable of supporting constant user load over a long duration of time.
  • Perform Baseline Testing with 50% load on current SAP Environment.
  • Determine whether the system is capable of supporting a 200% stress load (i.e., 800 SAP FIORI portal users) while retaining its functional stability and responding within pre-defined guidelines.
  • Determine whether the system is capable of supporting the target transaction rate while retaining its functional stability and responding within pre-defined guidelines.
  • Identify any bottlenecks preventing the achievement of performance targets and provide recommendations for their resolution.

SOLUTION

What Was Developed:

Performance Testing Approach

The following types of performance testing were conducted as part of this project:

  • Shake-out Testing
  • Baseline/Benchmark Testing
  • Load Testing
  • Stress Testing
  • Soak Testing

During testing, the following aspects of system performance were monitored to ensure the system meets basic performance criteria (a post-run analysis sketch follows this list):

  • Load: The volume of work that the system is processing, expressed in terms of concurrent users and transaction rate.
  • Response Time: The time it takes the servers to respond to client requests.
  • Error Rate: The number of errors as a factor of the total number of transactions.
  • Server Resource Consumption: The infrastructure utilisation while under load.
  • Stability: The system’s ability to behave in a consistent manner.
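
As a hedged sketch of how such criteria can be evaluated after a run, the code below computes a 90th-percentile response time and an error rate from a raw results export. The CSV layout (transaction,elapsed_ms,status) is an assumption, not LoadRunner's native format:

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class ResultsAnalyser {
    public static void main(String[] args) throws Exception {
        List<Double> times = new ArrayList<>();
        long errors = 0;
        long total = 0;

        // Assumes a non-empty results file with no header row.
        for (String line : Files.readAllLines(Paths.get("results.csv"))) {
            String[] fields = line.split(",");
            total++;
            times.add(Double.parseDouble(fields[1]));
            if (!"PASS".equals(fields[2])) {
                errors++;
            }
        }

        Collections.sort(times);
        double p90 = times.get((int) Math.ceil(times.size() * 0.9) - 1);
        System.out.printf("90th percentile response: %.0f ms%n", p90);
        System.out.printf("Error rate: %.2f%%%n", 100.0 * errors / total);
    }
}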

TECHNOLOGY STACK

Key Tool used:

Micro Focus LoadRunner

EXECUTIVE SUMMARY

Overall Findings from Performance Testing

Conclusions:

  • The system was found to be capable of supporting 200 and 400 concurrent users in the Azure environment. During the load test, response times for all key transactions were within acceptable SLAs in this environment.
  • There was a massive improvement in transaction response times in the Azure environment compared to the traditional on-premise environment.
  • Server parameters (memory, disk, network) in Azure were always within acceptable limits for the 200-user and 400-user tests.
  • The system was stable and capable of supporting a 400-user load test (in Azure) with a payroll run and batch job executing in the background, with no major degradation in response times.
  • The system was stable and capable of supporting an 800-user stress test.
  • One key transaction took more than 5 seconds during the 400-user test (in Azure) with batch jobs and a payroll run.
  • A spike in CPU utilisation was observed during soak testing when the payroll run executed in parallel.
  • One key transaction and a few internal transactions took more than 5 seconds during the stress test, which needs further investigation.
  • A spike in CPU utilisation was observed during stress testing, which needs further investigation.

OUTCOMES

  • Major performance improvement was observed after migration from DB2 in the on-premise environment to SAP HANA in Azure.
  • No specific recommendations were required for the Azure environment for the 200-user (50% load) and 400-user (100% load) tests, since the system handled the concurrent users with response times within acceptable SLAs.
  • Further investigation was needed into the increased transaction response times for one key transaction and a few internal transactions during the stress test (800 users). Increased CPU utilisation was also observed during the stress test and needed investigation. After fixing a few configuration settings on the SAP server and tuning DB queries, the issues were re-tested and resolved.