
What If Data Scientists Had Licenses Like Lawyers?

Data scientists, if they’re poorly qualified or act irresponsibly, can do at least as much damage as lawyers and doctors. The algorithms they create can ruin lives, aggravate social divisions, even facilitate genocide.

Which makes me wonder: Why shouldn’t they have a professional association to guide and police their behavior, like lawyers and doctors do?

Anyone who doubts the power of professional accreditation need only witness the antics of Donald Trump’s lawyers as they seek to challenge the 2020 election results. As soon as they enter a courtroom, their fraud claims dissolve. For good reason: They know they can lose their licenses or even be charged with crimes if they knowingly lie or misrepresent facts to a judge. At a time when truth and honesty seem vanishingly rare, it’s like a miracle.

Granted, occupational licensing has its downsides. As Milton Friedman famously argued, it can insulate incumbents from competition and increase the prices of services. In some cases — such as florists and barbers — it has probably gone too far. That said, I think I’m not alone in being willing to pay more to ensure that buildings don’t fall down, doctors aren’t total quacks and lawyers aren’t utterly corrupt. And in such crucial areas, where quality can be difficult for individual consumers to assess or act upon, standardized requirements are certainly better than often-biased and easily gamed rankings such as Yelp.

Consider what associations require of lawyers. They must pass the bar exam, which defines what it means to be professionally informed and qualified. They must attend yearly ongoing education, to ensure they stay abreast of developments in the law. They must adhere to ethics standards, many based on the American Bar Association’s Model Rules of Professional Conduct — rules that have managed to contain even Trump’s lawyers.

Data scientists have none of this. Although practitioners share many skills, there’s no standard curriculum, and boot-camp programs often fail to provide important technical background. Ongoing education is voluntary, despite its necessity in a field where tools are built and discarded regularly. As regards ethics, behold the case of Facebook, where data scientists reportedly weakened an algorithm designed to demote posts deemed “bad for the world” because it threatened to reduce profits.

So what would requirements for data scientists look like? Although there’s no obvious way to build the perfect test of technical skills, any exam would certainly cover basic data wrangling and algorithm training, implementation and testing. Its drafters could also borrow ideas from actuary licensing, such as a thorough understanding of linear algebra and statistics.
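To give a flavor of what such an exam might test, here is a minimal sketch of an exam-style exercise in plain Python: clean messy records, then fit a one-variable least-squares line by the closed-form formula. The data and task are hypothetical illustrations, not part of any real licensing exam.

```python
from statistics import mean

# Hypothetical exam task: wrangle messy records, then fit
# y = a + b*x by least squares using the closed form
#   b = cov(x, y) / var(x),  a = mean(y) - b * mean(x)

raw = [("1.0", "2.1"), ("2.0", ""), ("3.0", "6.2"), ("4.0", "8.1")]

# Data wrangling: drop records with missing fields, convert to floats.
clean = [(float(x), float(y)) for x, y in raw if x and y]

xs = [x for x, _ in clean]
ys = [y for _, y in clean]

mx, my = mean(xs), mean(ys)
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
    (x - mx) ** 2 for x in xs
)
a = my - b * mx
print(f"fit: y = {a:.3f} + {b:.3f} * x")
```

The point of testing this by hand, rather than through a library call, is the same as on actuarial exams: it verifies that the candidate understands what the tools are doing underneath.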

The ethical standards would have to recognize the peculiarities of the job. Unlike lawyers, who interact directly with judges and clients, data scientists answer primarily to their employers, typically large technology companies. Their loyalties are split between the people who can fire them and the public they might be harming.

So, where lawyers are required to “Respect the Rights of Third Persons” — for example, by not taking advantage of people who don’t know the law or don’t have legal representation — the rules for data scientists might focus on more fundamental questions. Does this algorithm violate the law — say, by discriminating against people according to race or gender? Does it exploit people’s data or attention in ways that they could not have anticipated or knowingly approved? Professionals should be responsible not only for their own actions, but also for reporting any violations they witness. This could help put a damper on socially undesirable decisions of the Facebook variety.
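A first-pass version of the discrimination question can even be automated. The sketch below, in plain Python, compares an algorithm's approval rates across groups and flags a large gap (a "demographic parity" check). The records, field names and tolerance are all hypothetical; real audits use richer fairness criteria and real decision logs.

```python
# Hypothetical audit: flag a decision system whose approval rates
# differ too much across groups (a demographic-parity check).
decisions = [
    {"group": "A", "approved": True},
    {"group": "A", "approved": True},
    {"group": "A", "approved": False},
    {"group": "B", "approved": True},
    {"group": "B", "approved": False},
    {"group": "B", "approved": False},
]

def approval_rates(records):
    """Return the fraction of approved decisions per group."""
    totals, approved = {}, {}
    for r in records:
        g = r["group"]
        totals[g] = totals.get(g, 0) + 1
        approved[g] = approved.get(g, 0) + int(r["approved"])
    return {g: approved[g] / totals[g] for g in totals}

rates = approval_rates(decisions)
gap = max(rates.values()) - min(rates.values())
flagged = gap > 0.2  # the tolerance is a policy choice, not a statistic
print(rates, "gap:", round(gap, 3), "flagged:", flagged)
```

A check like this cannot settle whether discrimination occurred — a rate gap can have legitimate explanations — but requiring professionals to run and report it would make violations of the Facebook variety harder to wave away.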

Data scientists in particular, and technologists in general, build the digital architecture we all rely on. While a poorly engineered website simply won’t work, a terrible algorithm can go undetected and unmitigated for months or years. Considering the increasing role such algorithms play in people’s lives — deciding what they see, whether they get credit, whether they get hired, where they go to college, how much time they spend in prison — placing some responsibility on their creators seems a reasonable ask.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

Cathy O’Neil is a Bloomberg Opinion columnist. She is a mathematician who has worked as a professor, hedge-fund analyst and data scientist. She founded ORCAA, an algorithmic auditing company, and is the author of “Weapons of Math Destruction.”

©2020 Bloomberg L.P.