Project – GSA

United States General Services Administration


Consolidated Security Scan Findings Database and Workflow Management Tools Suite 

CTA provided data engineering and application development services to the United States General Services Administration (GSA) to design and implement a custom web-based process support application and data platform to collect and standardize system security scan data and to track and manage remediation activities.



The GSA Federal Acquisition Service (FAS) Information Security (InfoSec) group is responsible for monitoring the security of internal GSA servers, databases, and web applications. InfoSec security engineers perform regular vulnerability scans on key GSA systems, analyze any discovered vulnerabilities, and coordinate remediation activities. Security scans of the various systems are conducted using several third-party scanning tools, which generate logs of all discovered vulnerabilities. The output from these scans can contain thousands of individual vulnerabilities ranging in importance from critical to insignificant, with some percentage of these vulnerabilities being false positives. The structure and format of the scan data were unique to each vendor and scanning tool. Converting the scan findings into a trackable and reportable form required a time-intensive manual process to triage, analyze, assign, update, and close each finding. Meeting formal reporting requirements involved a further layer of manual cut-and-paste effort, which delayed communication and remediation of important security vulnerabilities.

The primary driver for this effort was the need to consolidate scan data onto a common platform and data structure, supporting both rapid review of a large volume of findings to determine appropriate action and quick assembly of standardized reports to meet internal and external reporting obligations.


Solution Approach

CTA’s approach to FAS InfoSec’s challenge was to design and implement a custom data platform to create a single common data structure for all scan data and a web-based application to collect and manage scan data and provide access to required reports. System design focused on four primary objectives:

  • Creation of a standardized data structure to merge tool-specific scan output
  • Automation of scan data input and standardization
  • Creation of web-based process support tools to facilitate vulnerability analysis and management of remediation efforts
  • Automation of compliant report output


Objective #1: Common Data Structure

Scan data collected by each scanning tool was stored in a unique format, according to a unique data structure, making data from each tool functionally incompatible with data from each of the other tools. A common structure to describe and store the information representing each security vulnerability was required to support centralized storage and management of scan findings. To standardize scan data from multiple vendors and scanning tools, the underlying information was analyzed and a custom data structure was constructed to hold all required data elements. The output from each scanning tool was then mapped to the common structure. This allowed scan data from all tools to be represented and stored in a single database with unified labels and relationships.
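The actual GSA schema is not described in detail here, but a common finding structure of this kind can be sketched as a single record type that every tool's output is mapped into. The field names below are illustrative assumptions, not the deployed schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Finding:
    """Hypothetical common-structure record for one scan finding.

    Every scanning tool's output is translated into this one shape so that
    findings from all tools can live in a single database table.
    """
    finding_id: str                  # unique ID assigned by the tracking system
    source_tool: str                 # which scanning tool reported the finding
    system_name: str                 # the GSA system that was scanned
    scan_date: date
    severity: str                    # normalized scale, e.g. "critical" .. "informational"
    title: str
    description: str = ""
    status: str = "new"              # new / assigned / remediated / false_positive
    due_date: Optional[date] = None  # remediation deadline, once triaged
    assigned_to: Optional[str] = None

# A record from any tool, once mapped, looks the same as one from any other tool.
f = Finding("F-001", "toolA", "sys1", date(2020, 1, 15), "critical", "Outdated TLS version")
```

Because all tools converge on one record type, downstream search, tracking, and reporting code never needs tool-specific branches.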


Objective #2: Automated Data Input

A simple, automated mechanism was needed to move scanning tool data into the central system and to translate the data to the standardized data structure. A main design goal was to create a mechanism that required minimal manual processing by the user. The approach the team took was to construct a custom extract, transform, and load (ETL) mechanism that would accept an existing export format from each scanning tool and extract all relevant data for storage in the new database. Each of the scanning tools in use provided native export to XML file output, so a generic plug-in style interface was created, with a custom mapping set defined for each scanning tool to translate its tool-specific XML output to the standardized data structure. This approach allowed for specific handling for each scanning tool while still providing a generic interface to accommodate new scanning tools in the future. To import scan data into the application, users simply exported the scan results from the scanning tool to the appropriate format and uploaded the exported file into the new tracking application. At that point the ETL mechanism processed the uploaded file to transform all scan data and load it into the application database.
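The plug-in mapping idea can be illustrated with a small sketch: each tool contributes only a mapping set (where the vulnerability elements sit in its XML, and which tool-specific tags feed which common fields), while a single generic loader does the work. The tool names, XML tags, and field names below are invented for illustration; the real export formats differ per vendor.

```python
import xml.etree.ElementTree as ET

# Hypothetical per-tool mapping sets. Adding support for a new scanning tool
# means adding one entry here, not writing new loader code.
TOOL_MAPPINGS = {
    "toolA": {
        "item_path": ".//vulnerability",                      # where findings live in toolA's XML
        "fields": {"title": "name", "severity": "risk", "host": "target"},
    },
    "toolB": {
        "item_path": ".//issue",
        "fields": {"title": "summary", "severity": "level", "host": "hostname"},
    },
}

def load_findings(xml_text: str, tool: str) -> list[dict]:
    """Translate a tool-specific XML export into common-structure records."""
    mapping = TOOL_MAPPINGS[tool]  # plug-in lookup; unknown tools raise KeyError
    root = ET.fromstring(xml_text)
    records = []
    for item in root.findall(mapping["item_path"]):
        # Pull each common field from the tool-specific tag it maps to.
        record = {common: item.findtext(source, default="")
                  for common, source in mapping["fields"].items()}
        record["source_tool"] = tool
        records.append(record)
    return records

# A fabricated toolB export: one finding, in toolB's own tag names.
sample = ("<scan><issue><summary>Open port</summary>"
          "<level>high</level><hostname>db01</hostname></issue></scan>")
records = load_findings(sample, "toolB")
```

The generic loader never changes when a new tool is onboarded; only a new mapping set is registered, which matches the "specific handling per tool, generic interface overall" goal described above.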


Objective #3: Process Support Tools

The primary purpose of performing scans and collecting finding data is to support rapid identification and remediation of critical system security vulnerabilities. Given this purpose, the primary functional value of the tracking application was facilitation of triage and tracking activities. Having scan data consolidated and standardized enabled the development team to design and implement a set of user-friendly tools that allowed users to search and view findings across multiple GSA systems and scans based on a wide variety of searchable properties. The capability to quickly isolate and act on a list of scan findings allowed FAS security engineers to review large volumes of data, quickly eliminate false positives and non-actionable items, and tag significant findings for further action. Once identified, progress on each high-priority finding could be tracked using properties such as due dates, assigned resources, and mitigation plan steps. The application also captured notations and other documentation to build a history for each finding, and supported quick identification of late findings not addressed within the specified window of time. Selection of a web-based application platform kept system and development costs low and the pace of feature deployment high while eliminating the need for per-user endpoint installations and upgrades. Changes to the application were deployed to the web server and were immediately available to all users.
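The triage workflow described above, filtering a large pool of findings down to the actionable subset by combining searchable properties, can be sketched in a few lines. The property names are the same illustrative ones used earlier, not the application's actual field names.

```python
# Hypothetical consolidated findings, already translated to the common structure.
findings = [
    {"id": 1, "system": "sys1", "severity": "critical", "status": "new"},
    {"id": 2, "system": "sys2", "severity": "low",      "status": "new"},
    {"id": 3, "system": "sys1", "severity": "high",     "status": "false_positive"},
]

def search(findings: list[dict], **criteria) -> list[dict]:
    """Return findings matching every given property, e.g. severity='critical'.

    Because all tools share one structure, a single filter works across
    findings from every scanning tool and every scanned system.
    """
    return [f for f in findings
            if all(f.get(key) == value for key, value in criteria.items())]

# Isolate open findings on one system; the tagged false positive drops out.
actionable = search(findings, system="sys1", status="new")
```

Each record surviving the filter would then carry the tracking properties (due date, assigned resource, mitigation steps) through to remediation.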


Objective #4: Reporting Platform

The FAS InfoSec group was required to support federally mandated FISMA-compliant reporting intervals and formats. These reports were based on pre-defined layouts, which were reproduced in the new tracking application. Prior to this project, these reports were assembled by cutting and pasting text from scanning-tool-generated Adobe Acrobat files or other sources, so a primary design goal was to minimize the effort required of the user to create the required report output. Because of the standardized data structure, the reports team was able to leverage the Reporting Services platform included with the Microsoft SQL Server database software to provide on-demand reporting capability. This approach saved the cost of a separate reporting product, allowed the team to rapidly build a variety of report formats without additional application coding, and provided the capability to export reports to a variety of user-friendly formats, such as Microsoft Excel and Adobe Acrobat. Using this reporting solution, users were able to run up-to-date reports in all required formats on an as-needed basis and easily export those reports for sharing with downstream recipients, saving hours of manual cut-and-paste effort.
