Common Approaches to Automated Application Security Testing - SAST and DAST

Not All Automated Software Security Assessment Approaches Are Created Equal

When planning a testing strategy for an application, it is important to evaluate the applicability and likely effectiveness of the available testing approaches. The two most common approaches to automated application security testing are static application security testing (SAST) and dynamic application security testing (DAST).

SAST involves testing application artifacts – such as source code or application binaries – at rest. The testing tool reviews the application source or binary to build a model, and then performs various types of analyses such as semantic, data flow, or control flow to identify potential vulnerabilities based on a set of rules tuned for the type of application, language, and application framework in use. DAST involves testing a running application by sending inputs and analyzing the application’s responses to see if those request/response patterns indicate the presence of security vulnerabilities.
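To make the model-plus-rules idea behind SAST concrete, here is a minimal sketch in Python (not any particular commercial tool): it parses source code into an abstract syntax tree – the "model" – and applies one illustrative rule that flags calls to potentially dangerous functions. The `DANGEROUS_CALLS` set and `find_dangerous_calls` name are inventions for illustration; real tools perform far deeper semantic, data flow, and control flow analysis.

```python
import ast

# Illustrative rule set: function calls we flag as potentially dangerous.
DANGEROUS_CALLS = {"eval", "exec", "system"}

def find_dangerous_calls(source: str) -> list[tuple[int, str]]:
    """Parse source into an AST (the 'model') and apply one simple rule:
    flag any call to a function in DANGEROUS_CALLS.
    Returns a list of (line_number, function_name) findings."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            name = None
            if isinstance(node.func, ast.Name):
                name = node.func.id          # e.g. eval(...)
            elif isinstance(node.func, ast.Attribute):
                name = node.func.attr        # e.g. os.system(...) -> "system"
            if name in DANGEROUS_CALLS:
                findings.append((node.lineno, name))
    return findings

sample = "import os\nuser_input = input()\nos.system(user_input)\n"
print(find_dangerous_calls(sample))  # [(3, 'system')]
```

Even this toy shows why language support matters: the analyzer above only understands Python grammar, so pointing it at Go or Java source would fail at the parsing step.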

There are several concerns to take into account when evaluating SAST. The first potential roadblock is not having access to the application source code. This is usually available for software developed in-house; however, in some cases internal security controls or organizational politics will make it difficult, if not impossible, for you to get access to the source code. Additionally, if you are planning to test third-party software, it is likely that you will not have access to source code unless you have negotiated this access. With no source code access, static analysis is likely off the table. While there are some tools and providers that enable static security analysis based on an application binary, they often require binaries compiled with debug symbols or other specific compiler settings, which might be similarly difficult to acquire for third-party software.

A second potential roadblock is a static analysis tool that doesn't support the application's language. If you have an application written in Go, but your static analysis tool only supports Java and .NET, then that tool is not going to be useful for testing the target application. Most static analysis tools have solid support for common enterprise development languages such as Java and .NET, but are either missing support or have less mature support for other languages. Similarly, you must also consider whether your static analysis tool supports the application type. Typically, these tools have two major components – analyzers that build a model of the application under test, and rules that are applied to the model during testing. Most commercial static analysis tools support web applications, and many support mobile applications, but if you are looking to test IoT applications you need to consider how mature the rule set is for the application in question. Missing or less mature rule sets will result in less effective testing.

The final potential roadblock is a static analysis tool that doesn't support the application frameworks in use. This is most frequently an issue for web applications, but it can apply to other application types as well. If you have a web application written in Java using the Spring framework, it is important to understand whether the static analysis tool can detect the use of important Spring constructs, and whether it has rules for detecting security issues based on the way that Spring handles user requests.
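As a rough illustration of why framework awareness matters: Spring binds user-controlled input to handler parameters through annotations such as `@RequestParam`, so an analyzer that does not recognize those constructs cannot identify where untrusted data enters the application. The sketch below is hypothetical – a regex scan rather than real parsing – and the `spring_taint_sources` name is invented for this example.

```python
import re

# Illustrative only: a framework-aware rule must know that Spring binds
# these annotations to user-controlled input (i.e., taint sources).
SPRING_SOURCE_ANNOTATIONS = re.compile(
    r"@(RequestParam|PathVariable|RequestBody|RequestHeader)\b"
)

def spring_taint_sources(java_source: str) -> list[str]:
    """Return the Spring input-binding annotations found in the source --
    the places a framework-aware analyzer would treat as untrusted input."""
    return SPRING_SOURCE_ANNOTATIONS.findall(java_source)

controller = '''
@GetMapping("/user")
public String user(@RequestParam String name) { return "Hi " + name; }
'''
print(spring_taint_sources(controller))  # ['RequestParam']
```

A tool without this framework knowledge would see `name` as an ordinary local variable and miss any injection flaw downstream of it.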

Just as the evaluation of SAST tools can uncover potential issues in testing, dynamic application security testing (DAST) can present roadblocks as well. Most commercial DAST tools are intended to test web applications, so you must first confirm that the application in question is in fact a web application. There are both open source and commercial fuzzing tools and frameworks that can be used to test other types of applications – such as server daemons – but the applicability of these techniques and the level of expertise required to take advantage of them may make them less attractive options when starting to craft a testing program.
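The request/response analysis at the heart of DAST can be sketched with a simple reflected-input check: inject a distinctive payload and see whether the response echoes it back unescaped. This is a simplified, hypothetical check using simulated responses – no live target is contacted, and real scanners use many more payloads and detection heuristics.

```python
import html

PAYLOAD = "<zztest1234>"  # a distinctive marker payload (hypothetical)

def reflected_unescaped(payload: str, response_body: str) -> bool:
    """Core of a reflected-input check: the payload appearing verbatim in
    the response suggests the app echoes input without output encoding;
    only the HTML-escaped form appearing suggests encoding is in place."""
    return payload in response_body

# Simulated responses standing in for what a scanner would fetch:
vulnerable = f"<p>You searched for {PAYLOAD}</p>"
safe = f"<p>You searched for {html.escape(PAYLOAD)}</p>"
print(reflected_unescaped(PAYLOAD, vulnerable))  # True
print(reflected_unescaped(PAYLOAD, safe))        # False
```

Because a check like this only sees HTTP traffic, it works without source code access – which is exactly why DAST is often the fallback when the SAST roadblocks above apply.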

Assuming the application is web-based, you must ask whether your DAST tool supports testing the target application. Not all DAST tools are capable of testing APIs or web applications written using newer development approaches such as single-page applications (SPAs). Critically, you must also determine whether you have permission to test the system in question. For software developed in-house this is hopefully straightforward: you can use either a production instance or – often better – a pre-production version of the application for security testing. However, for software developed or hosted by suppliers, partners, or other third parties, testing access can be more complicated. It is best to negotiate rights for security testing, as well as requirements for remediation, during the application acquisition process, as this can be hard to do after the fact. Understanding your ability to test these systems will be critical in determining what, if any, testing approach can be taken.

Not all automated assessment approaches are created equal. When developing an automated testing strategy for an application, it is critical to match the testing approach and testing tool to the characteristics of the target application. Failing to do this will result in ineffective or less effective testing, but properly matching automated testing tools to applications can provide valuable insight into the applications' security.
As Chief Technology Officer and Principal at Denim Group, Dan Cornell leads the technology team to help Fortune 500 companies and government organizations integrate security throughout the development process. Prior to Denim Group, he served as the CTO of BrandDefense, architecting and developing their cutting-edge intellectual property protection technologies. Additionally, he previously served as the Vice President, Global Competency Leader for Rare Medium's Java and UNIX competency center, was a founder and VP of Engineering for Atension, Inc. and developed simulation applications for the Air Force with Southwest Research Institute.