A collaboration with Stanford’s d.School and Apple’s Legal Team. Our goal was to create an interactive prototype for a native privacy application to educate and engage users about Apple’s privacy policies.
MY ROLE + DELIVERABLES
Design Mentor and Interactive Prototype Designer. We delivered an interactive prototype of the concept to Apple’s Legal Team.
Survey Design, Usability Testing, Remote Team Collaboration, Sketching, Team Brainstorming, Rapid Prototyping, Interactive Prototyping
Mechanical Turk, Qualtrics, Marvel App, Proto.io, Adobe Illustrator, Adobe Photoshop
Now that’s a big problem. I was one of three designers invited by Margaret Hagan, a d.Lab lecturer at Stanford, to mentor law students through the user-centered design process. The d.Lab partnered with Apple’s Legal Team to redesign their privacy policies.
Our goal was to integrate Apple’s privacy policies, website privacy policies, and device settings. We focused on super users who were tech savvy and understood privacy tradeoffs. Fewer than 10% of users spend two or more minutes reading Terms of Service agreements, and we wanted to create a more engaging experience around privacy settings.
Our Design Question: How might we help a 27-year-old tech-smart art student feel a sense of understanding and control over how her personal data is used by Apple, so that she views Apple’s privacy behaviors in a positive light and increases her trust of and loyalty to the company?
My role: I created the final interactive prototype for the team using their research findings. Currently, privacy settings live in the “Settings” application on Apple devices. Our design solution was a native app in the Apple ecosystem that included the following three features:
- Improved transparency
- Multiple options for customization
- A social feature
Apple goes to great lengths to protect users’ privacy, but our research revealed that this is not immediately apparent to users. One way Apple protects users’ privacy is by storing most of their data locally on the device, which limits the amount of data stored on servers. Our application highlights this with informative text that explicitly tells users what benefit they receive by enabling a feature, which data is collected, and where that data is stored. Together, these details show how we improved transparency.
Feedback from users
I liked the ‘where it is’ feature A LOT. The ‘cloud’ is not an acceptable answer to me anymore. When news breaks about the latest hack, I want to know where my stuff was, not wait for a letter saying it was my data.
I don’t know what “aggregated interactions” means. Nor “interactions” – isn’t everything an interaction? Also unclear on the difference between “Personal Content” and “Personal Information”.
I like the Spotlight Suggestions. This information is clear and useful. This screen makes the information about location identification easier to understand.
This was my favorite part of the app. Clearly described and I felt confident that I was getting info I haven’t had before and that would help me make an informed decision.
One of the main ideas we hoped to implement was the ability to select from a set of preset privacy profiles or completely customize the device settings. Our original idea was to create three profiles:
- Extreme privacy (disables most applications)
- Extreme sharing (enables most applications)
- Custom (allows more tech savvy users to individually customize all their settings)
After testing these ideas, we discovered that more tech-savvy people weren’t interested in extreme privacy or extreme sharing. We also found it difficult to use neutral language for the setting names and descriptions. Additional user testing, such as a card sort, might have resolved this problem, but our time was extremely limited. We chose to simplify the customization feature to a single profile and add a social feature to provide additional profiles.
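The preset-plus-custom concept above can be sketched as a simple data model. This is a minimal illustration only; the profile names and setting keys are assumptions for the sketch, not Apple’s actual settings.

```python
# Minimal sketch of the privacy-profile concept.
# Setting keys ("location", "analytics", "ads") are illustrative
# assumptions, not Apple's real settings names.

PRESETS = {
    "extreme_privacy": {"location": False, "analytics": False, "ads": False},
    "extreme_sharing": {"location": True, "analytics": True, "ads": True},
}

def build_profile(preset=None, overrides=None):
    """Start from a preset (defaulting to the most private one)
    and apply any per-setting custom overrides."""
    settings = dict(PRESETS.get(preset, PRESETS["extreme_privacy"]))
    settings.update(overrides or {})
    return settings

# A tech-savvy user starts from "extreme sharing" but disables ad tracking.
custom = build_profile("extreme_sharing", {"ads": False})
```

The single remaining "Custom" profile in our final design corresponds to the `overrides` path: users start from one sensible baseline and tweak individual settings.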
Feedback from users
Select Customized… seemed to be intentionally misleading, like it would choose what I would see, but by using the trust term was hoping I would just “trust” the software to choose what’s important
I like to limit any personally identifiable information. This would be useful especially if I could easily disable a specific website or application for individual transactions like purchasing.
Crowdsourcing, user-generated content, and social networking have become important methods for establishing credibility with users. We wanted to explore how we could leverage the power of the collective consciousness in the privacy realm. There were three important aspects to our social feature:
- Profiles from privacy “experts”
- The ability to share your settings with family and friends
- Indicators of a setting’s popularity
We included privacy profiles of experts like Edward Snowden and the Electronic Frontier Foundation to demonstrate that a user could learn about a person or organization and easily adopt their recommended settings.
One use case we strongly considered was the tech-savvy child of less tech-savvy parents. We designed the ability to share settings with family members, like parents or grandparents, who may not take the time to read or understand all the settings. This feature would let a less tech-savvy person simply adopt a family member’s settings without going through each individual setting.
The last social feature, included on the settings information pages, was a statement showing how many people in the user’s network had adopted the setting. The pictures below illustrate the social feature in more depth.
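The two social mechanics described above, adopting an expert’s published settings and showing network adoption rates, can be sketched as follows. The profile names and setting keys are illustrative assumptions for the sketch.

```python
# Sketch of the social feature: one-tap adoption of an expert's
# recommended settings, and the "N% of your network" indicator.
# "EFF" and the setting keys are illustrative assumptions.

EXPERT_PROFILES = {
    "EFF": {"location": False, "analytics": False, "ads": False},
}

def adopt(profile_name, expert_profiles=EXPERT_PROFILES):
    """Copy an expert's recommended settings in one step."""
    return dict(expert_profiles[profile_name])

def adoption_rate(setting, network):
    """Percent of the user's network that has enabled `setting`."""
    enabled = sum(1 for member in network if member.get(setting))
    return round(100 * enabled / len(network))

network = [{"location": True}, {"location": True}, {"location": False}]
rate = adoption_rate("location", network)
```

One user’s critique that “90% was the standard on all” maps directly onto `adoption_rate`: if most people keep defaults, the indicator mostly measures default-acceptance, not deliberate choice.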
Feedback from users
I thought it was cool to see the recommendations of thought leaders.
I LOVE this. I totally want the EFF settings! (I use privacy badger). And also, Laura Poitras! What does she recommend? There are celebs of privacy. And regular celebs can seriously use their help!
It’s very creepy, what right do I have to use someone else’s settings and how do I know if they have given explicit consent to let me know what their settings are?
The “90% of your network” function is less helpful, as it seems 90% was the standard on all, which means no one is actually paying attention to these settings, but accepting the default.
“Social” privacy needs explaining but beyond that, love it. I want to outsource this decision-making process. And there are people I trust when it comes to privacy. Love how this helps me find the ‘middle path’ between a clamshell phone and giving up everything.
Michael Boeke, co-founder of Synap, talks about the fine line between creepy and cool when designing for trust. Based on the user feedback we received, we need to increase transparency for the social features to land on the cool side. If we were to continue working on this project, our next steps would be to:
- Test the social feature
- Include real-time visual feedback when settings change with enabled features
- Conduct a card sort of the terms used