In May, Nielsen released a study that found the number of apps per smartphone in the U.S. jumped 28 percent between 2011 and 2012. Assuming this trend does not suddenly end, mobile app developers should expect to be busy.
But it also means mobile apps are going to continue to gain attention from attackers. In an August report, researchers at Arxan Technologies noted that 92 percent of the Top 100 paid iOS applications and 100 percent of the Top 100 paid Google Android applications have been hacked.
An important step in addressing this issue is to stop attackers from reverse-engineering applications. Fortunately, there are a number of steps developers can take to keep their work from being tampered with, said Kevin McNamee, director and security architect at Kindsight Security Labs. For one, they can obfuscate the Java byte code to prevent attackers from figuring out what the code does.
"This also prevents attackers from easily modifying the code to build hijacked versions or inject malware into the application," he said. "[Android] developers can use ProGuard, which is a free obfuscator, or DexGuard, which has a license fee but is specifically designed for Android."
In addition, developers can distribute key aspects of the application as libraries to protect algorithms and other intellectual property and encrypt important configuration information to prevent tampering, he added.
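A minimal sketch of the configuration-encryption idea, using the standard javax.crypto API with AES-GCM so that tampering with the stored blob is detected at decrypt time. The class and method names are illustrative; on Android the key should come from the Android Keystore rather than being generated ad hoc as it is here:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;

public class ConfigCrypto {
    private static final int GCM_TAG_BITS = 128;
    private static final int IV_BYTES = 12;

    // Encrypt a configuration string; the random IV is prepended to the ciphertext.
    static byte[] encrypt(SecretKey key, String plaintext) throws Exception {
        byte[] iv = new byte[IV_BYTES];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_BITS, iv));
        byte[] ct = cipher.doFinal(plaintext.getBytes(StandardCharsets.UTF_8));
        byte[] out = new byte[iv.length + ct.length];
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ct, 0, out, iv.length, ct.length);
        return out;
    }

    // Decrypt; GCM authentication throws if the blob was modified.
    static String decrypt(SecretKey key, byte[] blob) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, key,
                new GCMParameterSpec(GCM_TAG_BITS, Arrays.copyOfRange(blob, 0, IV_BYTES)));
        byte[] pt = cipher.doFinal(blob, IV_BYTES, blob.length - IV_BYTES);
        return new String(pt, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        SecretKey key = kg.generateKey();
        byte[] blob = encrypt(key, "api_endpoint=https://example.com");
        System.out.println(decrypt(key, blob));
    }
}
```

Because GCM is an authenticated mode, an attacker who flips bits in the stored configuration triggers a decryption failure instead of silently feeding the app altered settings.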
Just as hackers analyze code for weaknesses, developers need to analyze any code they embed in their applications, because they accept any risk in that code and pass it through to end users, said Tyler Shields, senior security researcher at Veracode.
"Double and triple check your permissions," he said. Application developers use third parties’ libraries in code to speed up the creative process. Code reuse is not only common, but one of the main tenants of being an effective and efficient developer. However, there are problems that come with code reuse, including knowing exactly what the code you are reusing does."
In addition, Shields noted, third-party code can attempt to leverage any extraneous permissions that may have made their way into the application.
"Moral of the story: limit your permissions to only those required for the operation of your application and check the security of any piece of third party code that you embed into your application," he said.
Playing fast and loose with loopholes in their permission model can hurt developers in the long run, noted Domingo Guerra, president and co-founder of Appthority.
"Time and time again, we've seen apps that find a way to circumvent the permission model set forth by Apple or Google," he explained to SecurityWeek. "For example, apps are required to ask a user's permission before accessing their location if using the device GPS. However, some developers have found that if the device GPS isn't used, an app can still track location by using third-party APIs or GeoIP tracking without asking the user for permission. This is just bad form and bad practice, if you need user data, ask for it (and let your users decide). Your reputation as a developer is important."
But, according to Shields, one of the major mistakes developers often make isn't even technical.
"One of the biggest mistakes mobile application developers make is thinking they’re done learning after they graduate," he said. "Only a limited amount of time is spent educating developers on security-related subjects while in school, such as how to build security measures into their mobile applications…To counter this, I’d suggest developers participate in yearly security awareness sessions offered by top security companies or seek out additional security training at a local university."