Lexpert Magazine

July 2019

Lexpert magazine features articles and columns on developments in legal practice management, deals and lawsuits of interest in Canada, the law and business issues of interest to legal professionals and businesses that purchase legal services.

Issue link: https://digital.carswellmedia.com/i/1148218


if not zero. For example, you walk into a public building, and there is a sign telling you there are surveillance cameras capturing your image. But there is no real effective method of gathering your express consent to being recorded.

The OPC has been saying for some time now that the very concept of consent has to be re-thought, particularly in certain digital and online circumstances. Therefore, likely what will be in the statute's overhaul are a few concepts that address this important matter. First, there may well be direction on how data handlers have to behave when consent is not practical. And second, there likely will be provisions allowing for collection of certain information without consent, provided that certain protections are implemented and properly exercised during the collection of the information.

For example, some emphasis may be put on data de-identification. That is, there would be circumstances where certain specific elements of personal information can be collected without consent, provided (and this is a very important proviso) the information is immediately stripped of those elements that could link it to a certain individual. And ideally this process of de-identification will be performed by the machine that captures the information in the first place, so that the identifying elements are never uploaded to the collecting entity's main computers (rather, in those computers one will only have anonymized data).

Moreover, the collecting organization will also have to take steps to avoid any possible "re-identification" of the now anonymized data. And the good news is that there is an entire technology and process discipline around these safeguards that is increasingly mature, so that the effective risk of re-identification is getting smaller and smaller.
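The on-device de-identification the article describes can be illustrated with a minimal sketch. The field names and the record shape below are purely hypothetical assumptions, not anything the legislation or the OPC prescribes; the point is only that direct identifiers are dropped before a record ever leaves the capturing device.

```python
# Illustrative sketch only. The set of "direct identifiers" and the record
# fields are assumptions for demonstration; a real deployment would define
# these against recognized de-identification standards.
DIRECT_IDENTIFIERS = {"name", "email", "face_image", "device_id"}

def de_identify(record: dict) -> dict:
    """Strip fields that could link the record to a specific individual,
    so only anonymized attributes are uploaded to the main servers."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

# A hypothetical record as captured by the device...
captured = {"name": "Jane Doe", "device_id": "CAM-0042",
            "timestamp": "2019-07-02T10:15:00", "zone": "lobby"}

# ...and what actually gets transmitted upstream.
anonymized = de_identify(captured)
```

The design choice mirrors the article's point: because stripping happens where the data is captured, the identifying elements never reach the collecting entity's computers at all, which is a stronger safeguard than de-identifying after upload.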
The helpful irony here is that very often in the computer law space, while it is technology that kicks up the challenge, it is another technology that offers the solution (or a big part of it).

To Be Or Not To Be

Speaking of de-identification, expect the new privacy law to address the so-called "right to be forgotten". Let's say some time ago you were declared bankrupt, and it was so long ago that the official government record no longer shows you to be a discharged bankrupt. But guess what: if you do a search (or more to the point, if a prospective employer does a search) online, various references to your bankruptcy can be found. What the "right to be forgotten" affords you is the ability to ask the operator of the search engine to change its records such that your bankruptcy does not come up in the online search.

It's an interesting right, and not one without vocal spokespeople on both sides of the issue. Some argue it is dangerous to try to "eradicate the past", and that full transparency requires the truth at all times, however embarrassing that might be. On the other hand, some argue that the public policy rationale for "wiping clean" certain public records should simply be carried forward into the internet search realm; otherwise, the public policy objective of "redemption" will be thwarted.

How Did You Do That?

Speaking of transparency, another very topical issue that may well be addressed in the upcoming legislation involves artificial intelligence (AI). Let's say you apply for a loan with an online fintech. They actually boast that their AI-adjudication engine works so fast that loan applications can be vetted, and a decision made, in a matter of minutes. Now, let's say you are refused the credit. You reach out to the fintech and ask: "Why was I turned down for the loan?" In pre-AI days, this sort of question had a fairly straightforward answer.
The financial institution had relatively clear lending criteria, based on your income and other standard factors, and if you met the criteria, you were given the loan, and if you did not, you were refused a loan. Simple.

Things can get more complicated when an AI credit approval engine is added to the mix. The engine compares your data to millions of other customers of the fintech or of a credit bureau service, or perhaps tens of millions of customers across the entire industry, and based on actual data patterns involving repayment and default (together with a range of other interesting data points), the AI system makes a decision. But here's the thing. The actual, specific factors relevant to your credit application may not actually be known, or at least not known in the ordinary course of the fintech's business model.

To counter such a situation, the new legislation may provide for "algorithmic transparency"; that is, when you ask why you were turned down for the loan, the fintech actually has to figure out how the AI engine came to that decision. This would be a very interesting provision, and would, in effect, make users of AI applications accountable for their computers. One concern that it would address is the risk that AI software would perpetuate various biases, if the relevant data sets used to teach the AI engine are themselves susceptible to bias. Solving for this sort of concern will have some very interesting knock-on effects. For example, if you are with a financial institution, and you are acquiring such an AI credit approval engine, you may want to address

"The fairly new European privacy law that came into effect last year provides for a maximum fine equal to 4% of the worldwide sales of the company that breached the statute."
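One way the "algorithmic transparency" obligation described above could be met in practice is by recording, for every decision, how much each input factor contributed to it. The sketch below is a deliberately simplified stand-in: the weights, threshold, and factor names are invented for illustration, and a linear score stands in for whatever engine a real fintech would use.

```python
# Hypothetical sketch of decision-time explanation logging. The weights and
# threshold are invented values, not any real lender's criteria.
WEIGHTS = {"income": 0.5, "repayment_history": 0.4, "existing_debt": -0.6}
THRESHOLD = 1.0

def decide_with_explanation(applicant: dict):
    """Return the credit decision together with a per-factor breakdown,
    so the lender can later answer "why was I turned down?"."""
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    score = sum(contributions.values())
    approved = score >= THRESHOLD
    return approved, contributions

# A hypothetical applicant with normalized inputs.
approved, why = decide_with_explanation(
    {"income": 2.0, "repayment_history": 1.0, "existing_debt": 1.5})
```

Storing the breakdown alongside the decision is the key design choice: the specific factors behind each outcome become known "in the ordinary course of business", rather than needing to be reconstructed after a complaint.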
