Privacy in a digital world

By Srinivas Poosarla and Ramesh N | January 2020 | 10 min read
Data privacy has emerged as an important aspect of human rights. However, its fulfillment is challenged by organizations' and individuals' desires to reap the rewards of the digital marketplace. Enterprises need to be responsible in the way they obtain and use data. A strategic decision is whether they want to use privacy as a differentiator or treat it as another compliance burden.

Your own data is no longer private

Imagine that after a grueling day at work, you try to book a cab using your smartphone. Before you can even check whether the fare is acceptable, you realize that your smartphone's battery is almost dead. Undeterred, you confirm the cab despite the unusually high fare. That high fare may not be the result of a shortage of cabs; it may instead have been triggered to take advantage of your situation.

Behind the seemingly customer-friendly UX, the cab aggregator's algorithms tirelessly collect data from your smartphone, including its remaining battery level. Uber, for one, has denied making fare decisions this way but has admitted to collecting battery-level information. According to an Uber spokesperson, battery level is "one of the strongest predictors of whether or not you are going to be sensitive to surge pricing."1 Even more concerning are the possibilities that exist when personal information, more than what is required, is available to service providers.

Here is another scenario: you are browsing on your smart TV and are shown advertisements of fast-food joints — this, after you have ordered dinner from your laptop. It dawns on you that your user behavior across internet-enabled devices is being tracked. According to a complaint filed by the FTC and the Office of the New Jersey Attorney General,2 owners of Vizio smart TVs did not know that while they were watching TV, Vizio was watching them back. It did this by capturing on-screen selections and matching them against a database of TV, movie and commercial content.

The company also identified viewing data from cable or broadband service providers, set-top boxes, streaming devices, DVD players and over-the-air broadcasts. In order to enable this for older models, VIZIO retrofitted the TVs by installing its tracking software remotely — all without the knowledge or consent of its customers. Our private lives within our homes are no longer private — whether we are watching TV or browsing the internet.

Finally: you notice a spike in your car insurance premium, with no apparent reason given your accident history. You later find out that your insurance provider uses telematics to set premiums based on driving behavior derived from inputs such as speed, acceleration and the hours at which you usually drive.3 All these data inputs were within legal limits and were received from various "internet of things" sensors installed in your car. Yet the sale and use of this data happen without your consent.

None of these are hypothetical examples; they are real-life instances of privacy infringements or risks that can and sometimes do emerge from unscrupulous use of technology. At the same time, innovative technology delivers enhanced value and immense benefits to consumers. The aspiration should be to make this a positive-sum game. Organizations that benefit from the use of personal information should deploy appropriate privacy safeguards and empower consumers with knowledge and choice on how information collected from them could be used.

The evolution of privacy

Privacy has been a basic characteristic of human beings throughout history, from the earliest civilizations. In the days when food was gathered through hunting and tribes lived together in caves, the needs of the tribe took precedence over privacy, but as civilization moved into agriculture and families began living under a single roof, the need for privacy evolved. From those days until about a century ago, the notion of privacy was the right to solitude and to not having one's physical space intruded upon by other individuals. Technology has changed that notion, shifting the focus from the "physical space" in which an individual lives to the individual.

Over the past few decades, data privacy has emerged as an extremely important dimension of human rights. However, its fulfillment is constantly challenged by the need to embrace the rewards of the digital marketplace. The digital society is omnipresent — in the organizations we work for, the mobile apps we use in our day-to-day lives, the e-governance public utilities we avail ourselves of as citizens and the social media on which we connect — all of which are a necessity today.

As a result, many countries have either strengthened or enacted data privacy regulations that hold organizations accountable for respecting individuals' choices about their data and for protecting their privacy. The European Union and California have enacted the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act, respectively. In India, the draft Personal Data Protection Bill is under review by a Joint Parliamentary Committee.

Privacy and its contexts

Privacy is context-sensitive. Individuals interact every day with family, friends and colleagues, and they expect the information they share with each group to remain within that group. This reasonable expectation of privacy that exists in the physical world, that one should have (a) control of one's information, including basic identity, and (b) control of how such information may be used by others, continues to hold even when one interacts with the internet and connected devices.

According to Professor Alan Westin,4 there are four states of privacy that individuals need at different times.

Solitude, which is about staying apart and withdrawing from social life. It is one of the most relaxed states, sought when people want to heal, rest and prepare to re-engage with society.

Intimacy, when people want to exchange thoughts, feelings, emotions and views with a spouse or a close friend, or to seek help in a professional capacity from a family lawyer, doctor or counselor.

Reserve, when one is present within a group but, while communicating with its members, needs to withhold certain information or go unnoticed.

Anonymity, which is achieving privacy while in a crowd or a public place. In this state, by virtue of being part of a crowd in shopping malls, places of worship, theaters, pubs, sporting events and the like, one is neither noticed nor recognized.

Big data and AI blur the boundaries between social contexts in the digital world

None of these dimensions of privacy undermine an individual's need to be social. Rather, they provide opportunities for introspection, undisturbed experience and recuperation before one returns to the social arena. Time spent with family, friends and colleagues gives individuals the opportunity to learn; privacy, on the other hand, gives them the freedom to explore, to think without interference, to learn from mistakes and to develop their own opinions. It allows people to make choices about profession, life partnership, religion and so on without intrusion. Privacy is thus an essential ingredient for individuals to grow in society and become complete persons.

But big-data-driven technologies, along with the use of artificial intelligence, have blurred the boundaries between various social contexts in the digital world. Health monitoring apps, social media sites, job sites and dating apps access users’ digital footprints to know and often predict when one is likely to be pregnant, depressed, going through a breakup or rejecting a job offer.

The impact of technology on privacy

Understanding the inherent harms to privacy that digital technologies can introduce is a prerequisite to determining suitable privacy safeguards. The three areas of impact on individuals, arising from informational and decisional privacy, are:

Public disclosure of private facts

When private information about an individual, such as health or banking details, is made public without the individual's knowledge or consent, it can cause irreversible harm. It can also cause the individual distress in the form of loss of reputation, financial loss, physical harm and discrimination. Apart from security breaches, information can also be released through the malicious or deliberate deployment of technologies, such as eavesdropping, activating microphones through mobile apps or tracking location. For example, the Sweden-based app Truecaller imports smartphone users' contacts from their address books without the consent of the people whose numbers end up in its database. While the app helps promote freedom of speech and information, it affects anonymity, which is an equally important aspect of a democratic society. Home automation, using smart speakers and IoT devices, improves convenience and efficiency but can affect intimacy unless configured appropriately. Tort or civil law in most countries, along with privacy regulations, governs such intrusions into privacy.

Subconscious influence

Often, information about an individual, such as social media likes or dislikes, purchase preferences, reading habits, religious beliefs and associations, is analyzed along with other data for the purpose of profiling. The insights derived from such profiling can be used to influence the individual's mind with the intent of steering them toward certain desired behavior. This in turn impairs the individual's autonomy to make informed decisions, nudges them into behavioral change and creates bias at a subconscious level. An e-commerce website suggesting books that may be of interest to an individual is a harmless and beneficial use case. However, a filter-bubble-enabled search engine that shows results driven by an algorithm prevents the individual from seeing a neutral set of results. The most infamous example may be that of Cambridge Analytica,5 which exposed how far-reaching such covert influence can be on individuals and society. Laws to regulate this area are emerging but are mostly in a nascent stage.

Automated decisions about individuals

When decisions about an individual are made solely by algorithms using data obtained from various sources, the individual may be subjected to outcomes they would want to challenge, because there is no human intervention or discretion. While AI improves efficiency, it lacks empathy. Determining suitability for employment, calculating health insurance premiums based on data from fitness apps and assessing loan eligibility based on credit history are a few instances where automated decision-making is widely applied. Since machine learning is data- and statistics-driven, the efficacy of an algorithm depends on design considerations such as the variables chosen, the volume of data used to train the algorithm and the accuracy of the data obtained from various sources.6
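To make that dependence concrete, here is a minimal, hypothetical sketch in Python: a loan-eligibility decision produced entirely by a model, so the outcome is determined by the variables chosen and the historical data used for training. The feature names and values are illustrative assumptions, not drawn from any system described above.

```python
# Hypothetical sketch: an automated loan decision with no human review.
# The outcome depends entirely on the chosen features and the training data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Assumed features: [years_of_credit_history, missed_payments, income_in_thousands]
X_train = np.array([
    [8, 0, 95], [1, 4, 30], [5, 1, 60], [0, 2, 45],
    [10, 0, 120], [2, 3, 35], [7, 0, 80], [3, 2, 40],
])
y_train = np.array([1, 0, 1, 0, 1, 0, 1, 0])  # 1 = approved in historical data

model = LogisticRegression().fit(X_train, y_train)

applicant = np.array([[4, 1, 55]])
decision = model.predict(applicant)[0]           # binary outcome, no discretion
probability = model.predict_proba(applicant)[0]  # driven purely by past data

print("approved" if decision == 1 else "rejected", probability)
# If the chosen variables or the historical data encode bias, the decision will too.
```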

Regulations such as the GDPR7 have emerged that stipulate the right of an individual not to be subject to a decision or profiling based only on automated processing that significantly affects the person, including legally. In the United States, the Consumer Online Privacy Rights Act bill was introduced in November 2019.8 While it has not yet been passed, it addresses algorithmic decision-making: it requires those engaging in the practice to conduct annual impact assessments for accuracy, fairness, bias and discrimination before they can facilitate advertising or eligibility determinations for housing, education, employment or credit.

Keys to minimizing privacy risk

Almost all organizations collect and process personal data. Privacy risk exposure varies depending on the extent to which the business derives value from personal data. Cybersecurity deals with known or unknown hackers, and organizational machinery is geared toward protecting the enterprise. In the case of data privacy, beyond the hackers who might steal personal data, internal functions such as marketing and human resources, by virtue of their core business processes, become potential sources of privacy violations.

While it may be difficult to realize the full potential of technology without affecting privacy to some degree, it is worth noting that privacy is not an absolute right. Embedding privacy into design, and empowering individuals with meaningful choices that are subsequently honored, makes the approach technology-agnostic and minimizes organizational risk.

Embed privacy into design while adopting innovation, to make it a positive-sum game

Organizations should establish governance to ensure that their strategic business objectives stay aligned with the privacy objectives of processes involving personal data, and to manage the risks arising from potential areas of noncompliance. With increasing globalization, organizations process data from different geographies and are subject to the data privacy regulations of countries with differing privacy laws. It is prudent to adopt an international privacy standard, such as ISO 27701, on which the organization's privacy information management system (PIMS) can be modeled. Making "privacy by design" an integral part of organizational processes minimizes the risk of noncompliance with privacy principles.

From the beginning of the data life cycle, minimize data collection and processing, regardless of whether the data is collected directly from the individual or indirectly, such as an IP address. Since the technology industry benefits from more and more information due to big data dividends, risks can be minimized by segregating essential and optional information, with "informed consent" used as the lawful basis of collection for the latter category. Collecting specific data to provide a service may be necessary (for instance, mobile phone numbers for multifactor authentication), but sharing data with third-party agents offering value-added services should not be made a condition of the service. Individuals must be enabled to expose only a minimal amount of personal data — for instance, by routing calls to a cab driver through the cab aggregator's business telephone number.
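The sketch below illustrates one way such segregation might look in code. It is a simplified assumption, and the field names (mobile_number, email_for_offers, marketing_consent) are hypothetical rather than taken from any real service.

```python
# Hypothetical sketch: separating data essential to the service from optional
# data that is stored only when explicit, informed consent has been given.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SignupRequest:
    mobile_number: str                       # essential: needed for multifactor auth
    marketing_consent: bool = False          # optional: explicit opt-in
    email_for_offers: Optional[str] = None   # optional: value-added services only

def build_profile(req: SignupRequest) -> dict:
    profile = {"mobile_number": req.mobile_number}   # data minimization by default
    if req.marketing_consent and req.email_for_offers:
        # Optional data is retained only when consent was actually recorded.
        profile["email_for_offers"] = req.email_for_offers
    return profile

print(build_profile(SignupRequest(mobile_number="+10000000000")))
```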

Be more transparent about the processing of personal information through modular and layered privacy notices at the point of data collection. These notices should include information about analytics and automated decision-making. The personal data collected must be proportionate and not excessive, and only information specific to the purpose must be collected and processed. Personal data should be retained only as long as necessary and must be deleted or anonymized once the purpose for which it was collected and retained has been fulfilled. Where not legally mandated, data transfers to other countries or disclosures must be based on opt-in or opt-out mechanisms, according to the requirements of the applicable jurisdiction.
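Retention limits of the kind described above are often enforced by a periodic job. The following is a minimal sketch under assumed purposes and retention periods; the record schema and the choice of what to anonymize are illustrative, not prescriptive.

```python
# Hypothetical sketch: delete or anonymize records once their purpose-specific
# retention period has lapsed.
from datetime import datetime, timedelta, timezone

RETENTION = {"billing": timedelta(days=7 * 365), "marketing": timedelta(days=365)}

def apply_retention(records: list[dict]) -> list[dict]:
    now = datetime.now(timezone.utc)
    kept = []
    for rec in records:
        if now - rec["collected_at"] <= RETENTION[rec["purpose"]]:
            kept.append(rec)                                   # still needed
        elif rec["purpose"] == "billing":
            kept.append({**rec, "name": None, "email": None})  # strip identifiers
        # else: drop the record entirely once its purpose is fulfilled
    return kept

records = [{"purpose": "marketing", "name": "A", "email": "a@example.com",
            "collected_at": datetime(2020, 1, 1, tzinfo=timezone.utc)}]
print(apply_retention(records))  # the expired marketing record is dropped
```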

Consent is often interpreted as a silver bullet for collecting and processing any personal data. In reality, consent should never be used as a substitute for accountability. Individuals find it hard to decipher the fine print and complex processing details in privacy statements in order to determine what is right for them. The volume of information in a privacy statement does not exempt an organization from its accountability to comply with privacy principles.

Organizations that use AI to automate decisions must ensure data accuracy and avoid algorithmic opacity. They must inform individuals of the sources of the data and the logic used to make decisions that affect them. For example, a recruitment algorithm trained on data from past interviews must avoid the racial or gender biases that may have prevailed in the in-person selection process. When decisions have legal effects on individuals, additional safeguards may be required, including the right to seek human intervention where applicable regulations demand it. Unfair bias must be avoided through periodic reviews of algorithms and the results they produce. Privacy-enhancing technologies such as differential privacy must be used when working with big data, to minimize the risk of re-identifying anonymized data. Multiple privacy measures, including security-related controls, should be deployed based on the potential harm to individuals' rights and freedoms.
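Of the privacy-enhancing technologies mentioned, differential privacy lends itself to a short illustration. The sketch below applies the standard Laplace mechanism to a simple count query; the epsilon value and the opt-in data are illustrative assumptions.

```python
# Sketch of the Laplace mechanism: publish a noisy aggregate so that no single
# individual's presence in the data can be confidently inferred.
import numpy as np

def dp_count(values: list[bool], epsilon: float = 0.5) -> float:
    """Noisy count of True values; the sensitivity of a count query is 1."""
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)  # scale = sensitivity / epsilon
    return sum(values) + noise

# Example: report how many users opted in without exposing any one user's choice.
opted_in = [True, False, True, True, False, True]
print(round(dp_count(opted_in), 2))
```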

Value privacy or pay the price

The threats to data privacy are real. Regulators and regulations will only become more stringent. Consumer awareness will keep rising, and it is for each organization to decide whether it wants to use privacy as a differentiator or treat it as another compliance burden. The cost of privacy compliance is significant, but the cost of noncompliance is higher still, and it extends beyond legal repercussions to the brand itself.

References
  1. https://www.independent.co.uk/life-style/gadgets-and-tech/news/uber-knows-when-your-phone-is-about-to-run-out-of-battery-a7042416.html
  2. https://metro.co.uk/2019/09/27/uber-charge-battery-lower-10778303/
  3. https://www.ftc.gov/news-events/press-releases/2017/02/vizio-pay-22-million-ftc-state-new-jersey-settle-charges-it
  4. https://www.efma.com/article/detail/30774
  5. Privacy and Freedom by Dr. Alan Westin - https://www7.tau.ac.il/ojs/index.php/til/article/view/1609/1711
  6. https://newrepublic.com/article/151548/political-campaigns-big-data-manipulate-elections-weaken-democracy
  7. https://ec.europa.eu/newsroom/article29/Item-detail.cfm?item_id=612053
  8. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32016R0679
  9. https://iapp.org/news/a/us-senators-unveil-new-federal-privacy-legislation/