Last fall, analyst firm Gartner predicted that through 2015, 75 percent of mobile applications would fail basic tests related to security and enterprise policy.
A separate Frost & Sullivan survey of 300 enterprises found that 83 percent have at least one mobile app for employees to use on their devices, with roughly one in three having 11 or more.
Both surveys underscore a basic reality for IT: the adoption of mobile apps has made secure development practices critical.
“Mobile application security is one of the fastest growing problem areas for developers and ultimately C-level executives today,” said Theodora Titonis, vice president of mobile at Veracode. “Other layers of security, such as perimeter, network, and even data center, have established solutions in place to protect enterprises. However, we’re still living in the infancy of mobile application security. Right now, the coding lifecycle is still a bit Wild West, but we are seeing many mobile application developers institute best practices to produce more secure applications.”
In discussions with SecurityWeek, experts laid out some of the common security and privacy issues in mobile applications today.
EXCESSIVE PERMISSIONS
“Most of the mobile applications residing on mobile devices have more access permissions and privileges than are required on the device,” said Sameer Dixit, a director on the SpiderLabs team at Trustwave. “These permissions might include access to the user’s contact list and the ability to update it without notifying the user, receiving and sending SMS messages, location (recording the user’s GPS coordinates), and access to other device hardware components such as the camera and microphone.”
Application permissions should be limited to only the components necessary for the application’s functionality, he said. Developers should resist the urge to request permissions that might be used at a future date, he added.
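On Android, for example, one way to follow that advice is to ask for a sensitive permission only at the moment the feature that needs it runs, rather than declaring every permission the app might someday want. The Java sketch below is illustrative (class and method names are hypothetical) and uses the platform’s standard runtime permission APIs:

import android.Manifest;
import android.app.Activity;
import android.content.pm.PackageManager;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

public class CaptureHelper {

    private static final int REQUEST_CAMERA = 42; // arbitrary request code

    // Request the camera permission only when the capture feature is used.
    public static void startCapture(Activity activity) {
        if (ContextCompat.checkSelfPermission(activity, Manifest.permission.CAMERA)
                == PackageManager.PERMISSION_GRANTED) {
            launchCamera(activity);
        } else {
            // The system prompts the user; the answer arrives in
            // Activity#onRequestPermissionsResult.
            ActivityCompat.requestPermissions(
                    activity, new String[] { Manifest.permission.CAMERA }, REQUEST_CAMERA);
        }
    }

    private static void launchCamera(Activity activity) {
        // Application-specific capture logic would go here.
    }
}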
TLS FAIL
Mobile devices operate extensively on open WiFi networks where anyone in range can listen in on traffic intended for other users, said Cap Diebel, manager of application security at Denim Group.
“Most developers in this space understand the importance of transport layer security (encrypting data in-transit) for this reason,” Diebel said. “However, implementing this properly is more challenging for mobile developers than for those who develop web applications used by standard browsers. Mobile developers often neglect to ensure their application uses HTTPS rather than the openly-readable HTTP. Among developers who do try to implement transport layer security, many will code their application to ignore certificate verification to work around the complications in setting up a proper test environment. Objective-C allows developers to turn off these checks in one simple line of code: client.allowsInvalidSSLCertificate = YES;.”
“Certificate verification allows the mobile application to ensure it is communicating with the intended web services and not a malicious proxy,” he said. “Turning off this verification allows an attacker an ‘in’ on this sensitive traffic, especially if that user is operating in a malicious or compromised network.”
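On the client side, the fix is largely a matter of leaving those checks on and, where the backend is under the developer’s control, pinning the expected certificate so a rogue proxy cannot substitute its own. A minimal Java sketch using the OkHttp library might look like the following; the hostname and pin value are placeholders:

import okhttp3.CertificatePinner;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

public class ApiClient {

    // A default OkHttp client already verifies server certificates against the
    // platform trust store; nothing has to be disabled to make TLS "work" in a
    // test environment. Pinning adds a further check that the server's chain
    // contains an expected public key.
    private static final OkHttpClient CLIENT = new OkHttpClient.Builder()
            .certificatePinner(new CertificatePinner.Builder()
                    // Placeholder hostname and pin, for illustration only.
                    .add("api.example.com", "sha256/AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=")
                    .build())
            .build();

    public static String fetchProfile() throws java.io.IOException {
        Request request = new Request.Builder()
                .url("https://api.example.com/v1/profile") // HTTPS, never plain HTTP
                .build();
        try (Response response = CLIENT.newCall(request).execute()) {
            // An invalid or unexpected certificate makes the call fail here
            // instead of silently exposing traffic to a proxy.
            return response.body().string();
        }
    }
}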
PROTECT THE KEY
“Practices such as hard-coding a key directly into a mobile application can be problematic,” Titonis told SecurityWeek. “Should these keys be compromised, any security mechanisms that depend on the privacy of the keys are rendered ineffective. It must be assumed that keys stored in the mobile app and/or on the mobile device will be compromised due to the possibility of rooting or jailbreaking a device. If the developer stores the key on the device, it is an incomplete and insecure solution. If the developer prompts for credentials to generate the key whenever it is needed, there will be a degraded user experience.”
The on-device key storage problems go away if there is a trusted and protected connection to a secure server that asks for the key every time the app performs encryption or decryption – though the app may become unusable if there is slow network connectivity, she added.
As a solution, developers should look to OAuth 2.0 to eliminate the need to store keys on the device, she said. The advantage of this approach, she added, is that if an outsider gets access to the device or records the token via a man-in-the-middle attack, they only receive a restrictive token that is usable for certain use cases, such as viewing specific content. Tokens also have an added advantage in that they can be revoked and are only valid for a certain amount of time.
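As a rough illustration of that pattern, the Java sketch below (the endpoint, scope and class names are hypothetical) uses OkHttp to trade the user’s credentials for a short-lived OAuth 2.0 access token and keeps only that token in memory, rather than shipping a long-lived key inside the app:

import okhttp3.FormBody;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.RequestBody;
import okhttp3.Response;
import org.json.JSONException;
import org.json.JSONObject;

public class TokenStore {

    private static final OkHttpClient CLIENT = new OkHttpClient();

    // Held only in memory; no long-lived secret is baked into the app binary
    // or written to the device.
    private String accessToken;

    // Exchange the user's credentials for a short-lived, limited-scope token
    // at a (hypothetical) OAuth 2.0 token endpoint. If the token leaks it can
    // be revoked server-side and expires on its own; a hard-coded key cannot.
    public void signIn(String username, String password)
            throws java.io.IOException, JSONException {
        RequestBody form = new FormBody.Builder()
                .add("grant_type", "password")
                .add("username", username)
                .add("password", password)
                .add("scope", "read:content") // request only the access the app needs
                .build();
        Request request = new Request.Builder()
                .url("https://auth.example.com/oauth/token") // placeholder endpoint
                .post(form)
                .build();
        try (Response response = CLIENT.newCall(request).execute()) {
            // org.json ships with Android; on the plain JVM it is a small dependency.
            JSONObject json = new JSONObject(response.body().string());
            accessToken = json.getString("access_token"); // field name per RFC 6749
        }
    }

    public String bearerHeader() {
        return "Bearer " + accessToken;
    }
}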
PROTECTING PRIVACY
Developers also sometimes make the mistake of failing to encrypt application project resources, settings and preference files like .plist and .xml files, noted Trustwave’s Dixit.
“These files can contain sensitive information such as the last logged-in user, address, usernames, session tokens, device UDID, etc.,” said Dixit. “This is not limited to project resource or settings files. Even photos and videos taken by the application, user locations (GPS) and other sensitive user information stored locally on the device in various image, media and SQLite database files are often found to be unencrypted.”
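One way to close that gap, sketched below in plain Java using the standard javax.crypto API, is to run locally stored settings or media through authenticated encryption such as AES-GCM before writing them to disk; how the encryption key itself is protected is the separate problem discussed in the previous section:

import java.security.GeneralSecurityException;
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class LocalFileProtector {

    private static final SecureRandom RANDOM = new SecureRandom();

    // Encrypts serialized settings or media bytes before they are written to
    // the device (e.g. as a .plist, .xml or SQLite blob). Key management is
    // out of scope here; the key is simply passed in.
    public static byte[] seal(byte[] plaintext, SecretKey key)
            throws GeneralSecurityException {
        byte[] iv = new byte[12];               // 96-bit nonce, standard for GCM
        RANDOM.nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal(plaintext);

        // Store the nonce alongside the ciphertext; it is not secret.
        byte[] out = new byte[iv.length + ciphertext.length];
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ciphertext, 0, out, iv.length, ciphertext.length);
        return out;
    }

    public static byte[] open(byte[] sealed, SecretKey key)
            throws GeneralSecurityException {
        byte[] iv = new byte[12];
        System.arraycopy(sealed, 0, iv, 0, iv.length);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv));
        return cipher.doFinal(sealed, iv.length, sealed.length - iv.length);
    }
}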
APPLICATION LOGIC FLAWS
Application logic flaws are another common problem. One example, Dixit said, is when a user hits logout and the session is terminated only on the client side, either by updating a local resource file on the device or, in the worst cases, by simply returning the user to the application login screen without actually terminating the session on the server.
“Successful exploitation of this vulnerability would allow an attacker to obtain functional access to the victim’s account,” he said. “An attacker can therefore impersonate the victim and misuse the account. Other commonly observed flaws are privilege escalation vulnerabilities, insecure direct object references, authentication bypass and a lack of lockout after a number of unsuccessful login attempts, which leaves an application susceptible to a variety of brute-force attacks.”
Developers should always invalidate the session after logout on both the client and the server side, and expire sessions that have been inactive for a reasonable amount of time, he said. Users should also have to re-authenticate when a request attempts to access a privileged function over a terminated session.
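Put into code, that advice looks roughly like the Java sketch below (again using OkHttp and a placeholder endpoint): logout tells the server to kill the session and treats a failure there as an error, instead of quietly dropping the user back at the login screen.

import okhttp3.FormBody;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

public class SessionManager {

    private static final OkHttpClient CLIENT = new OkHttpClient();

    private String sessionToken; // set at login

    // Logout must invalidate the session on the server, not just clear the
    // local copy of the token or bounce the user to the login screen.
    public void logout() throws java.io.IOException {
        Request request = new Request.Builder()
                .url("https://api.example.com/v1/logout") // placeholder endpoint
                .header("Authorization", "Bearer " + sessionToken)
                .post(new FormBody.Builder().build())     // empty POST body
                .build();
        try (Response response = CLIENT.newCall(request).execute()) {
            if (!response.isSuccessful()) {
                // Surface the failure rather than pretending the session is gone;
                // otherwise the server-side session remains usable by an attacker.
                throw new java.io.IOException("Server-side logout failed: " + response.code());
            }
        } finally {
            sessionToken = null; // clear local state regardless
        }
    }
}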
“As revealed in our 2014 Trustwave Global Security Report, 96 percent of applications we scanned in 2013 harbored one or more serious security vulnerabilities,” he added. “Too often, application developers overlook security when building apps for mobile. It’s critical that application developers perform frequent vulnerability scanning and in-depth penetration testing so that they are continuously identifying and remediating security weaknesses during the development, production and active phases of applications. They need to make sure security is built in and not bolted on.”