
Making the Case for Ethics in Market Research

What can market researchers do to build trust? There are new tools that raise privacy protection to a new technical level. Understanding and using them is a business opportunity.

To discuss ethics in market research, we first have to draw a distinction between "ethics" and "compliance". Ethics refers to a strong commitment to your organizational values, while compliance evokes a legal obligation to do something, such as manage risk or adhere to privacy regulations, for example the General Data Protection Regulation (GDPR).

Trust is foundational to this topic, and we may consider the thesis that there is a trust deficit. Why does this matter? Trust should be core to business and research practice as a whole. Trust, as the old adage goes, takes years to build, seconds to break and forever to repair.

To get from a metaphorical level to current research findings, please consider the following examples:

  1. 39% of the UK public do not trust marketers with their personal data.
  2. According to a recent poll by the Global Research Business Network, only 34% of respondents from around the world said they trusted market research companies — similar to the level of trust they have in government.

How data and research may be abused

The focus on the customer is one aspect of this narrative. For example, researchers may be unaware of the customer’s motives for the research and of how the outcomes could be abused. The Cambridge Analytica scandal may indeed be only the tip of this iceberg. In the market research world, this tension between consumer, enterprise and the potential for abuse raises questions about the governance model: who is the target of this research, and for what purpose?

In an article first published in 2010 by Marcus J. Schmidt, entitled "Ethics in Marketing Research", the author pointed out several areas that are ripe for abuse: "The market research agency may disguise errors in the survey or in statistical computations. Sometimes, clients have a hidden agenda and try pressurizing agencies to make them come up with certain prespecified findings. Clients may misuse a methodological framework developed by an agency and agencies may inappropriately reuse a research design worked out exclusively for a specific client."

Data mining techniques certainly have the potential to erode trust and privacy. Frankly, there is not a day that passes without a breach or an abuse of data privacy being reported. In the end, what is required is a model that encourages transparent handling of data, paving the path toward a "fair data" framework that could be standardized and may go beyond what we see today.

Protecting privacy – there is much more than guidelines

Data Ethics and Privacy Enhancing Techniques intersect with one another. Data Ethics is strategic for organizational boards, as it offers an opportunity to define what guidelines for data transparency in research must be. In fact, the European Union has published recommendations for board-level functions in this space. Privacy enhancing techniques, in turn, are a space to follow closely. The hypothesis is that data is never really anonymized, especially when so-called metadata, data about data, is taken into account.

Self-Sovereign Identity, or SSI, is an example here. Several principles encompass SSI, one of which is selective disclosure: disclosing only what is required for the situation at hand. Hence, SSI is contextual.

Earlier this year, Texas launched its Texas by Texas (TxT) online portal, a single location for residents to access government services and connect with local agencies, and plans to make a TxT mobile app available for download in 2022. Both examples could be considered first steps toward self-sovereign identities (SSIs), where residents’ personally identifiable information is digitized and controlled by each individual, not by a government agency.

SSIs are different from a physical form of identity, like a laminated driver’s license or a paper vaccination card. Physical IDs can contain a variety of information related to a person’s identity (birth date, home address, etc.). With a digital SSI, users control what and how much information they share and whom they share it with, and they can choose to provide only the information necessary to complete a transaction. No information is kept by the organization asking for identification, and everything remains in the user’s control.

It’s not unusual for residents to interact with their local agencies multiple times every year. One person might renew a driver’s license, apply for a building permit and pay taxes in a single month. Each interaction requires signing in to different portals, with different credentials, and filling out the same personal information, even if not all of that information is required for the transaction.

With an SSI, the individual only gives out what is required for that specific transaction and can dictate how that data is used. For example, a person applying for a liquor license for a business may not be required to provide a home address. With an SSI, they don’t have to; they can provide only the information necessary to complete the transaction. The same goes for something like a vaccine passport: there’s no need for a Social Security number to be shared during the validation process. It’s enough for a user to verify their vaccine status; no other personal details are required.
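A minimal sketch can illustrate the idea of selective disclosure. The credential fields and the helper function below are purely hypothetical; real SSI wallets enforce this cryptographically, for example through verifiable credentials, rather than by simple filtering.

```python
# Illustrative sketch of selective disclosure: the holder releases nothing
# beyond what the verifier asks for. All field and function names here are
# hypothetical; real SSI systems enforce this with cryptographic credentials.

credential = {
    "name": "Jane Doe",
    "date_of_birth": "1985-04-12",
    "home_address": "Example Street 1, Zurich",
    "social_security_number": "000-00-0000",
    "vaccine_status": "fully vaccinated",
}

def selective_disclosure(credential: dict, requested: set[str]) -> dict:
    """Return only the attributes the verifier actually needs."""
    return {k: v for k, v in credential.items() if k in requested}

# A vaccine-passport check needs the status and nothing else:
print(selective_disclosure(credential, {"vaccine_status"}))
# {'vaccine_status': 'fully vaccinated'}
```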

More tools enhancing privacy

The toolkit against data leakage includes Differential Privacy, Homomorphic Encryption, Secure Multiparty Computation and Zero Knowledge Proofs (ZKPs), an alphabet soup of Privacy Enhancing Techniques, or PETs. In fact, Gartner believes that over half of organizations will implement PETs by 2025. Differential Privacy applies a mathematical definition to privacy itself, adding calibrated noise so that no single individual's data can be inferred from an analysis. Homomorphic Encryption (HE) permits computations to be performed on encrypted data without access to the secret key. We commonly see HE used in cloud implementations, with use cases that include healthcare.
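As a rough illustration of the Differential Privacy idea, consider a noisy counting query. The privacy budget (epsilon = 1.0) and sensitivity (1) below are assumptions for the sketch, not recommendations for production use.

```python
# Minimal sketch of the Laplace mechanism for differential privacy:
# a counting query has sensitivity 1, so adding Laplace noise with
# scale = sensitivity / epsilon yields an epsilon-differentially private answer.
import numpy as np

def dp_count(values, threshold, epsilon=1.0, sensitivity=1.0):
    """Noisy count of respondents above a threshold."""
    true_count = sum(1 for v in values if v > threshold)
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

ages = [23, 37, 41, 52, 29, 61, 45]
print(dp_count(ages, threshold=40))  # true count is 4, plus calibrated noise
```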

Secure Multiparty Computation, or sMPC, can be thought of as a cryptographic primitive in which several parties jointly compute a function without revealing their private inputs to one another. We have been witnessing the commercialization of Zero Knowledge Proofs, or ZKPs, over the past several years. The application is that I can prove something is true without revealing the data that proves it. One common use case cited is age: "I am between the age of x and y." Bringing academia and enterprises together for ZKP has been the motivation for the Zero Knowledge Proof Standards Group.
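The core idea behind sMPC can be sketched with additive secret sharing: each party splits its private input into random shares, the parties combine shares locally, and only the aggregate result is ever reconstructed. The sketch below is illustrative only; real protocols add secure channels and malicious-security guarantees, and statements such as the age range above call for dedicated zero-knowledge range proofs.

```python
# Minimal sketch of additive secret sharing, a building block of many sMPC
# protocols. Each input is split into random shares that sum to the input
# modulo a large prime; only the total is reconstructed, never any single input.
import secrets

PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(value: int, n_parties: int) -> list[int]:
    """Split a value into n additive shares that sum to it modulo PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def secure_sum(private_inputs: list[int]) -> int:
    """Each party shares its input; summing the shares reveals only the total."""
    n = len(private_inputs)
    all_shares = [share(x, n) for x in private_inputs]
    # Party i adds up the i-th share it received from every party...
    partial_sums = [sum(column) % PRIME for column in zip(*all_shares)]
    # ...and only these partial sums are combined into the public result.
    return sum(partial_sums) % PRIME

print(secure_sum([40, 25, 35]))  # 100, without exposing any single input
```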

Understanding how these PETs can be applied in market research is an industry opportunity. Finally, what kind of Ethics-Trust Barometer do you want to create? We have only touched the tip of the iceberg in making the case for ethics in market research. I look forward to continuing the discussion!

Monique Morrow

The Humanized Internet

Senior Distinguished Architect for emerging technologies at Syniverse Technologies

The author
With over 25 years’ experience as a global technology leader, Monique Morrow is Senior Distinguished Architect for emerging technologies at Syniverse Technologies, where her main role and responsibilities are to provide thought leadership and to develop the strategic direction and vision for Syniverse’s emerging technologies across the company, partners and industry forums.
Monique’s expertise is in cybersecurity, privacy, mobile payments, ethics in extended reality and distributed ledger technologies, for example blockchain.

Contact
monique@thi.ninja
+41 79 334 5723

https://thehumanizedinternet.com/

You can find all SWISS INSIGHTS News here:
SWISS INSIGHTS NEWS
