Privacy and security issues around mobile content
By Alex Hall
There are three key areas of consideration when looking at privacy and trust in mobile content:
1. Are content owners clear and transparent with the information they are requesting and why they require it?
2. Is it obvious to users what they are agreeing to, and what the end-benefit is for them?
3. Do content owners have the systems and processes to ensure that user data is safe and secure?
If the answer to any of these questions is no, there is a high risk of users losing faith in the brand or service, or worse, opening the brand up to public or governmental scrutiny.
If we assume the answer is yes, there are still key issues to address in balancing the user’s perceived value, ensuring a fluid experience within your environment, and adhering to industry best practice and various regional privacy laws.
The mobile content arena is already marked by almost too much choice.
Content services need differentiation, and brands that transparently offer increased utility in exchange for semi-private information such as location data, address book contacts and photo albums are well received.
Most users are eager to allow access to this information for increased ease of use and efficiency in tasks including communicating with friends, shopping and offers.
The most successful brands provide real or perceived value in exchange for what most users see as a relatively small amount of data, and most users do not feel the least bit intruded upon.
The backlash begins when brands ask for an inch and take a mile, usually by accessing personal information without asking. This was the case with Path, the popular social networking service; the incident prompted Apple to build added restrictions and notifications into iOS for applications that access a user's contacts.
Facebook has been a popular privacy pariah, with its frequent and often unannounced changes to privacy policies in service of giving advertisers greater access to user data.
Ultimately, if the utility adds enough value, given the context, then the user base will likely grow with no brand backlash or privacy concerns. But even the mightiest Goliath can be felled by one noisy David’s letter to their Congressman.
Speaking of government, regulations are ultimately intended to protect the end-user, but the sheer number of apps that individuals accumulate makes scrutiny of each privacy statement very unlikely.
Users will trust until they have a bad experience, and that experience can come in many forms, from unsolicited advertising to repeated opt-in or data-sharing requests.
I believe users should be deemed capable of making the subjective decision of whether a service offers enough utility, weighed against any intrusion or impediment to use, to keep using it.
In theory, regulation should need to be only minimal if we trust the increasingly high standards of user experience that users themselves demand.
From a behavioral standpoint, there are several generations who are already making these decisions.
Companies in the digital space that fail to deliver value or betray trust do not last long, whether it is Groupon or the multitude of smaller companies trying to make a mark.
Times are tough when even Facebook worries that a popular uprising could consume it, hence its aggressive push to remain the primary home for photo sharing and storage amid ongoing privacy outcries.
In summary, it stands to reason that we should expect a minimum level of regulation, but also allow companies to determine how they want to communicate their data needs within the context of optimizing the user experience.