In a recent post on Privacy Maven, we reviewed the problems with Google’s decision to make behind-the-scenes agreements with health groups to gather health data. In Maven Analysis: Privacy Is Not On Your Side in US Health Data Sharing, Privacy Maven examined the shortcomings of current healthcare privacy law in the United States, and the lobbying effort underway in Washington, D.C. to craft a national privacy law that continues to serve the data appetites of Silicon Valley.
A blog post today from Google entitled “Tools to help healthcare providers deliver better care” outlines some of the approach and rationale behind these agreements, and some of the reasons Google believes consumers should not fear the way this data is being gathered and used. This is the type of information companies should publish before such agreements are made, and in a way that lets people decide whether they are OK with their data being included in the dataset. Because of this reactive approach, the net result is that people don’t trust Google to be a good steward of their data; the company has not built up that trust with its customers in the past.
Part of the problem lies in the way Google initially responded, and continues even in this latest blog post, where their only defence is “we will not use this data to sell ads” and variations on that theme. They do not talk about the other ways they will use the data within Google, or who they will share it with for use cases outside “selling ads.” They focus on the one thing they will not do, and ignore the rest. This approach does nothing to build consumer or customer trust in the brand through transparency and real choice.
Companies today have many chances to build a reputation of trust with their customers, and just as many opportunities to lose that trust. Recent studies find that consumers increasingly patronise businesses that align with their ethics and values. Taking the time to define your ethics around data collection, use, and sharing, and communicating those values to your customers, will build trust in your brand and ensure that customers who buy on such alignment think of your brand when the time comes.
Secratic’s Digital Ethics Workshop walks through the ways your company uses and plans to use data, with a critical review of the boundaries you, as a business, are willing to expand or hold. The result: you can better communicate your position to customers and employees, and you know where your ethos is rooted, so you can stay true to your values as data collection and use expand.
Secratic’s Data Use Institutional Review Board (IRB) Program can then help your company implement a lightweight governance process that suits your needs. The Data Use IRB brings together a select group of key stakeholders from across your company, along with a repeatable process to review and authorise any new or expanded collection or use of data, ensuring your business, legal, and ethical positions are all accounted for.