By Hannah Bernstein
Google selected two Khoury research projects, one from Associate Professors Christo Wilson and David Choffnes and one from Assistant Professor Long Lu, for the company’s inaugural ASPIRE program. The pilot initiative is designed to fund research into mobile device security and privacy issues and to increase collaboration between university researchers and Google.
Programs of this nature are designed to jumpstart research by funding the exploration of a new topic that could use a boost to get underway, Lu reports. His project was one of two at Northeastern selected for funding.
“This is Google’s inaugural effort of sponsoring research in universities in the field of mobile security,” Lu says. “Their goal is to solicit research ideas that aim to solve an open security problem with mobile devices or apps.”
Lu proposed a research project to address the problem of apps abusing user-granted permissions for hidden or malicious purposes, such as a weather app that requests location data to show the local forecast but also sells it to advertisers. The problem, Lu explains, is that once users give an app permission to use their data, no technology is in place to control how the data is actually used.
“After apps obtain such data, how they use such data or how they propagate such data is not regulated by the mobile operating systems like Android or iOS,” Lu said. “It’s not because people don’t want to do it, it’s just because it’s a really hard problem to solve.”
Finding a solution is challenging because many possible fixes would make phones much slower. Lu’s proposal, which he believes avoids that pitfall, would create a security mechanism that regulates how apps use data, based on automatically verifiable contracts agreed to by app developers.
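To make the idea concrete, here is a minimal sketch of how a contract-based flow check might look. This is an illustration only, not Lu’s actual design: the `DataContract` and `TaintedValue` names, and the rule that permission-derived data may only reach developer-declared destinations, are assumptions made for the example.

```python
# Hypothetical sketch: a developer-declared "contract" lists which
# destinations ("sinks") each kind of permission-derived data may reach,
# and a runtime check blocks any flow the contract does not permit.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class TaintedValue:
    value: object
    source: str  # the permission that produced the data, e.g. "LOCATION"

@dataclass
class DataContract:
    # Maps a data source to the set of sinks it may flow to.
    allowed_flows: dict = field(default_factory=dict)

    def permits(self, data: TaintedValue, sink: str) -> bool:
        return sink in self.allowed_flows.get(data.source, set())

def send(data: TaintedValue, sink: str, contract: DataContract) -> str:
    if not contract.permits(data, sink):
        raise PermissionError(f"{data.source} data may not flow to {sink}")
    # A real system would perform the network call here.
    return f"sent {data.source} data to {sink}"

# A weather app's contract: location may feed the forecast service only.
contract = DataContract({"LOCATION": {"forecast-api"}})
loc = TaintedValue((42.34, -71.09), "LOCATION")

print(send(loc, "forecast-api", contract))  # permitted by the contract
try:
    send(loc, "ad-network", contract)       # not declared, so blocked
except PermissionError as err:
    print("blocked:", err)
```

The hard parts Lu alludes to, such as tracking data as it propagates through an app without slowing the phone down, are exactly what this toy example glosses over.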
In another lab on campus, Associate Professors David Choffnes and Christo Wilson also received Google funding to investigate a different strategy used by apps to gain private information about their users. Choffnes and Wilson’s research will focus on identifying and understanding dark patterns — an ominous term for user interfaces that manipulate consumers into making decisions against their interests.
“These are things like having a dialogue that pops up and says, ‘Are you okay with giving us permission to track everything you do on your phone and sell the data to other parties?’ and a big ‘Okay,’ and then maybe in very small text, ‘No, I want to see more details,’ to opt out,” Choffnes said.
Essentially, privacy dark patterns are the pesky, manipulative ways websites and apps make it difficult to protect your privacy and all too easy to give it away. Pop-up dialogues with confusing messages, hidden settings with data collection turned on by default, and unreadable privacy policies are just a few examples.
The project begins with a manual analysis of these strategies to identify commonly used dark patterns, understand how users perceive them, and assess the harm they do to user privacy. Those answers, Choffnes says, will hopefully inform both technical and policy solutions. Eventually, he adds, the team hopes to build an automated process that identifies dark patterns, helps users control their privacy, and informs regulations to protect consumers.
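One way to imagine such an automated check is a heuristic that flags consent dialogues where the “accept” option is far more prominent than the “decline” option, like the example Choffnes describes. This sketch is purely illustrative; the scoring rule, threshold, and `Control` structure are assumptions for the example, not the project’s actual method.

```python
# Illustrative heuristic: flag a consent dialogue as a possible dark
# pattern when the accept control is visually favored over decline.
from dataclasses import dataclass

@dataclass
class Control:
    label: str
    font_px: int     # rendered font size
    is_button: bool  # styled button vs. plain text link

def asymmetry_score(accept: Control, decline: Control) -> float:
    # Ratio > 1 means the accept path is visually favored.
    score = accept.font_px / decline.font_px
    if accept.is_button and not decline.is_button:
        score *= 2  # prominent button vs. easy-to-miss text link
    return score

def looks_like_dark_pattern(accept: Control, decline: Control,
                            threshold: float = 1.5) -> bool:
    return asymmetry_score(accept, decline) >= threshold

# The dialogue from Choffnes's example: a big "Okay" button and
# a tiny line of text to opt out.
accept = Control("Okay", font_px=18, is_button=True)
decline = Control("No, I want to see more details", font_px=9, is_button=False)
print(looks_like_dark_pattern(accept, decline))  # True for this mock dialogue
```

A real detector would need to handle many more signals, including wording, default states, and the number of steps each choice requires, which is why the project starts with manual analysis before attempting automation.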
“This project is really exciting because it gives us an opportunity to have high impact on consumer protection by breaking out of our disciplinary silos,” Choffnes said. “We’re really excited about this opportunity to take some of our extensive work on privacy done from the computer science side, and start marrying that with policy and law.”