
What Sort of Testing Do My Applications Need?

Testing Application Code for Vulnerabilities

As you start to get an idea of what your application portfolio looks like, you then need to start determining the specific risks that applications can expose your organization to. This is typically done through application security testing – identifying vulnerabilities in an application so that you can make risk-based decisions about mitigation and resolution.

The challenge lies in the fact that there is no “one size fits all” approach to application security testing. You cannot continuously perform exhaustive testing on all applications – you simply will not have the resources. You will also be limited in the types of testing you can do based on the type, language, and framework of the application, as well as the availability of source code. To begin the application security testing process most effectively, you need to determine the depth of testing you want to accomplish.

The risk associated with a specific application typically determines the necessary depth of testing. Determining risk means asking questions such as: What data does the application manage? How much would a breach cost? What sort of service level agreements do you have with stakeholders? When planning the depth of testing, it can be helpful to look at the specific assurances you would like to have about the application. Does it properly handle inputs that are passed to database SQL queries? Does it perform sufficient authorization checking in situations where the application accesses sensitive resources?
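These assurances can be made concrete in code. The sketch below is illustrative only – the schema, table, and function names are hypothetical – but it shows what the two questions above look like in practice, using Python's built-in sqlite3 module:

```python
import sqlite3

def get_invoice_amount(conn, user, invoice_id):
    # Assurance 1: input passed to the SQL query is bound as a
    # parameter ("?"), never concatenated into the query string.
    row = conn.execute(
        "SELECT owner, amount FROM invoices WHERE id = ?",
        (invoice_id,),
    ).fetchone()
    if row is None:
        return None
    owner, amount = row
    # Assurance 2: authorization is checked before the sensitive
    # resource is released to the caller.
    if owner != user:
        raise PermissionError("user is not authorized for this invoice")
    return amount
```

A testing plan then asks: how will we verify that every query in the application follows the first pattern, and that every sensitive access path includes the second check?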

A valuable resource for enumerating these concerns is the OWASP Application Security Verification Standard (ASVS). The OWASP ASVS provides three levels of assurance that can be applied to an application – Opportunistic, Standard, and Advanced – and provides specific guidelines for what analysis should be performed to verify an application at each level. The OWASP ASVS might not be a complete fit for all organizations and all aspects of their application portfolios, but it can provide a valuable starting point for organizations to modify and customize.

The process of crafting a testing plan for an application also involves enumerating the security concerns to be tested for, and then layering testing activities to provide coverage for those concerns.

As an example, a common goal in application security testing is to verify that the application properly handles inputs that are passed to databases, and is therefore free of SQL injection vulnerabilities. One way to accomplish this goal would be to use automated static analysis to look for SQL injection patterns in code. Another would be to use automated dynamic analysis to look for evidence of vulnerable behaviors in the running application.
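To give a feel for the static-analysis side, here is a deliberately crude sketch of the kind of pattern matching involved. Real SAST tools parse code and trace data flow far more deeply; the heuristic below is only a toy that flags Python DB-API calls whose query argument appears to be built from string concatenation, %-formatting, or an f-string:

```python
import re

# Toy heuristic: find .execute()/.executemany() calls and inspect
# how their query argument is constructed.
CALL = re.compile(r"\.execute(?:many)?\s*\((.*)")

def looks_injectable(line: str) -> bool:
    match = CALL.search(line)
    if not match:
        return False
    arg = match.group(1)
    # f-string, concatenation, or %-interpolation in the query
    # argument is a classic injection smell.
    return bool(re.match(r"\s*f['\"]", arg)) or "+" in arg or " % " in arg

def scan_source(source: str):
    """Return 1-based line numbers that match the injection smell."""
    return [
        number
        for number, line in enumerate(source.splitlines(), start=1)
        if looks_injectable(line)
    ]
```

A dynamic (DAST) tool approaches the same goal from the outside instead: it submits injection payloads to the running application's inputs and watches responses for evidence of vulnerable behavior.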

When crafting a testing plan for an application, either of these techniques might be selected in order to fulfill that testing requirement – depending on various characteristics of the application and the toolsets available to the organization.

Application security testing approaches can broadly be divided into two camps – automated and manual.

The automated approach typically requires initial setup and configuration performed by humans, but once that is accomplished, subsequent testing runs can be done in an unattended manner. Humans are still required to evaluate the results and identify potential false positives.

Manual application security testing involves humans either reviewing static application artifacts such as source code or application binaries, or exercising running applications. Manual testing can find entire classes of vulnerabilities that are out of the reach of automated testing, but because it requires human involvement, it can be much more expensive and incurs material incremental costs every time it is run.

Some examples that can provide insight into possible thought processes when evaluating testing strategies for specific applications are included below:

• High-risk web application developed in house: Because you have access to the source code and to both SAST and DAST tools that are well-suited to testing the application, use a combination of static and dynamic testing with both manual and automated components. Also consider annual third-party manual assessments to get an external opinion and access to testing techniques your in-house team might not have mastered.

• Lower-risk applications developed in house: Again, because you have access to both the source code and running environments, use either automated static or dynamic testing prior to each new release.

• Mid-risk applications developed by a third party: Because you don’t have access to the source code, but do have an internal pre-production environment, rely on vendor maturity representations and an annual dynamic assessment performed by a trusted third party.

• Developed by a large packaged software vendor: Given that you currently have no budget for independent in-depth research, rely on vendor vulnerability reports and use patch management practices to address risk.

My next column will look in more detail at static application security testing (SAST) and dynamic application security testing (DAST) as two common approaches to automated application security testing. These are important techniques to understand when evaluating the applicability and likely effectiveness of various testing approaches.

Related Reading: SAST and DAST: Part of a Balanced Software Security Initiative

As Chief Technology Officer and Principal at Denim Group, Dan Cornell leads the technology team to help Fortune 500 companies and government organizations integrate security throughout the development process. Prior to Denim Group, he served as the CTO of BrandDefense, architecting and developing their cutting-edge intellectual property protection technologies. Additionally, he previously served as the Vice President, Global Competency Leader for Rare Medium's Java and UNIX competency center, was a founder and VP of Engineering for Atension, Inc. and developed simulation applications for the Air Force with Southwest Research Institute.