Data | Ethics | Governance

Article Summary – Miscellaneous AI, Ethical Algorithms, and Big Data (Pt. 1)

  • 2017, How He Used Facebook to Win
    • “according to Grassegger and Krogerus, Cambridge Analytica had used psychological data culled from Facebook, paired with vast amounts of consumer information purchased from data-mining companies, to develop algorithms that were supposedly able to identify the psychological makeup of every voter in the American electorate”
    • “even more troubling was the underhanded way in which Cambridge Analytica appeared to have obtained its information”
    • “Facebook’s real influence came from the campaign’s strategic and perfectly legal use of Facebook’s suite of marketing tools”
    • “using psychological traits to craft appeals to voters, she wrote, wasn’t anything new—every candidate was doing it”
    • “Parscale launched Trump’s digital operation by buying $2 million in Facebook ads”
    • “the campaign was running 40,000 to 50,000 variants of its ads, testing how they performed in different formats, with subtitles and without, and static versus video, among other small differences” … “A/B testing on steroids”
    • “Cambridge Analytica had deployed its psychological targeting techniques during the Republican primaries on behalf of Ted Cruz, but Cruz’s failure to win the nomination was cited as evidence that Cambridge Analytica’s models were ineffective and that the company did not understand American politics”
    • “Parscale’s strategy of using Facebook’s ‘dark posts’ also turned out to matter, enabling the Trump campaign to attack Clinton with targeted negative ads that flew below the public radar”
    • “what our Facebook president has discovered is that it actually pays only to please some of the people some of the time. The rest simply don’t count”
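The “A/B testing on steroids” described above, comparing tens of thousands of ad variants on performance, comes down at its core to significance testing between variants. A minimal sketch in Python, with made-up numbers rather than anything from the campaign, using a two-proportion z-test on click-through rates:

```python
import math

def z_test_two_proportions(clicks_a, views_a, clicks_b, views_b):
    """Compare the click-through rates of two ad variants.

    Returns the z statistic; |z| > 1.96 suggests a real difference
    at the 5% level (two-sided test).
    """
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled click rate under the null hypothesis that the variants are equal
    p = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p * (1 - p) * (1 / views_a + 1 / views_b))
    return (p_a - p_b) / se

# Variant A: 120 clicks in 10,000 views; variant B: 180 clicks in 10,000 views
z = z_test_two_proportions(120, 10_000, 180, 10_000)
print(f"z = {z:.2f}")  # |z| > 1.96, so B's higher CTR is unlikely to be chance
```

At the scale the article describes, with 40,000–50,000 simultaneous variants, naive pairwise tests like this produce many false positives, which is why large ad platforms layer multiple-comparison corrections or bandit-style allocation on top.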
  • 2017, The role of expertise and judgment in a data-driven world
    • “strategy to me, when you boil it down to first principles, is three things. Number one, it’s an assessment of what you think truth is today. Number two, it’s a prediction of what you think truth is tomorrow. And number three, it’s a decision of how you’re going to place your resources amongst any number of different alternatives based on your prediction of truth”
    • “algorithmic prediction, which is essentially the use of the available bodies of data in order to predict the future, has replaced expertise inference”
    • “declining response rates across all forms of survey measurement”
    • “companies, whether it’s with their consultants or the consultants themselves, are going to have to think deeply about how they’re revising their survey research to take these biases into account, because they are structural, and not just unique to the political sector”
    • “what I … don’t think the data did do, was [have] an understanding of human story and human narrative”
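The structural non-response biases mentioned above are commonly corrected with post-stratification weighting: respondents are reweighted so the sample’s demographic mix matches known population shares. A sketch with invented numbers (the article does not describe any specific method):

```python
# Hypothetical post-stratification weighting. Population shares would come
# from census data; the skewed sample shares mimic declining response rates
# among younger respondents. All figures are illustrative.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
sample_share = {"18-34": 0.15, "35-54": 0.35, "55+": 0.50}

# Weight = how much each respondent in a group should count
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# (age group, supports proposal: 1 = yes, 0 = no)
responses = [("18-34", 1), ("55+", 0), ("55+", 0), ("35-54", 1)]

raw = sum(v for _, v in responses) / len(responses)
weighted = (sum(weights[g] * v for g, v in responses)
            / sum(weights[g] for g, _ in responses))
print(f"raw support: {raw:.2f}, weighted support: {weighted:.2f}")
```

The point of the exercise: the raw and weighted estimates diverge, and the correction is only as good as the population benchmarks it leans on, which is why the speaker calls the problem structural rather than fixable case by case.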
  • 2017, How Harley-Davidson Used Artificial Intelligence to Increase New York Sales Leads by 2,930%
    • “Adgorithms … works across digital channels, like Facebook and Google, to measure, and then autonomously optimize, the outcomes of marketing campaigns”
    • “once it determined what was working and what wasn’t, Albert scaled the campaigns, autonomously allocating resources from channel to channel, making content recommendations, and so on”
    • “AI systems don’t need to create personas; they find real customers in the wild by determining what actual online behaviors have the highest probability of resulting in conversions, and then finding potential buyers online who exhibit these behaviors”
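The behavior-based targeting in the quote can be pictured as propensity scoring: a model maps observed behaviors to a conversion probability and ranks visitors by it. A toy sketch with hypothetical features and hand-set weights (Albert’s actual model and inputs are proprietary and not described in the article):

```python
import math

# Hypothetical behavioral features, NOT Albert's actual inputs:
# pages_viewed, configurator_used (0/1), returned_within_week (0/1).
WEIGHTS = {"pages_viewed": 0.15, "configurator_used": 1.2, "returned_within_week": 0.9}
BIAS = -3.0  # negative intercept: most visitors never convert

def conversion_probability(user):
    """Score one visitor's observed behavior with a logistic model."""
    z = BIAS + sum(WEIGHTS[f] * user.get(f, 0) for f in WEIGHTS)
    return 1 / (1 + math.exp(-z))

visitors = [
    {"id": "a", "pages_viewed": 2},
    {"id": "b", "pages_viewed": 9, "configurator_used": 1},
    {"id": "c", "pages_viewed": 5, "configurator_used": 1, "returned_within_week": 1},
]

# Rank visitors by predicted probability of converting, highest first
ranked = sorted(visitors, key=conversion_probability, reverse=True)
print([v["id"] for v in ranked])  # prints ['c', 'b', 'a']
```

In practice the weights are learned from past conversions rather than set by hand; the contrast with personas is that the ranking is driven entirely by what each real visitor did.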
  • 2017, How Companies Say They’re Using Big Data

  • 2017, NYU Law’s Algorithms and Explanations
    • “can we develop algorithms (of any kind) that some day might satisfactorily explain their actions?”
    • “can we satisfactorily explain the actions of the algorithms we’re using in the real world today?”
    • “supervised learning models are nearly always trained with biased data and are often trained to optimize the wrong objective (e.g. clicks vs newsworthiness)”
    • “The machine learning community generally lacks the critical thinking skills to understand the question [that the model is trying to capture]”
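The “wrong objective” point (clicks vs. newsworthiness) is easy to make concrete: the same items yield different rankings depending on which label you optimize. A toy illustration with invented scores:

```python
# The same articles ranked two ways. The newsworthiness scores are
# hypothetical editorial labels, invented for illustration.
articles = [
    {"title": "Celebrity quiz", "clicks": 9000, "newsworthiness": 1},
    {"title": "Budget analysis", "clicks": 1200, "newsworthiness": 9},
    {"title": "Local election recap", "clicks": 800, "newsworthiness": 7},
]

by_clicks = sorted(articles, key=lambda a: a["clicks"], reverse=True)
by_news = sorted(articles, key=lambda a: a["newsworthiness"], reverse=True)

print([a["title"] for a in by_clicks])  # clickbait first
print([a["title"] for a in by_news])    # substantive coverage first
# A model trained to maximize clicks faithfully learns the first
# ordering even when the second is what was actually wanted.
```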
  • 2017, The Right Not to Be Subject to Automated Decisions Based on Profiling
    • This paper needs its own post (to come)
  • 2017, Big Data, Ethical Futures (linked page removed)
    • “‘ethical dilemmas’ are often signs that our methodological techniques are stretched too thin and failing us”
    • “when, if ever, is it ok to play with someone’s data if there’s no evident harm but we have no way to clearly test the long-term impact on a nebulous number of end users?”
    • “the reality is, when it comes to studying human interaction (for profit or scientific glory), it is no more (or less) complicated whether we’re interviewing someone in their living room, watching them in a lab, testing them at the screen, or examining the content they post online”
    • “it’s often acceptable in human subjects research to conduct experiments without prior consent, as long as everyone discussing the case agrees that the experiment does not impose greater risk to the person than they might experience in a typical day” … but “at some point the research subjects are told (“debriefed”) about their participation in the study and given the option to withdraw data collected about them from the study”
    • “it’s hard for anyone studying the digital signs of humans interacting online to know what people mean for us to see—unless we ask them”
    • “scientists and technology companies scrutinizing data bubbling up from the tweets, posts, driving patterns, or check-ins of people are coming to realize that we are also studying moments of humans interacting with each other. These moments call for respect, trust, mutuality”
  • 2017, Root Out Bias from Your Decision-Making Process
    • “it is easy to see how one can fall into the trap of making the decision first and then finding the data to back it up later”
    • “few people set out to make a rigged decision, but when you’re pressured to make a choice fast, you may fall victim to a flawed process”
  • 2017, Who Will Pay for the Future if Not the Robots?
    • “hard to imagine a future in which the US economy loses a third of its jobs to automation and governments just sit on their hands”
    • robot = “a system that exhibits ‘complex’ behavior and includes sensing and actuation”
    • “ambiguity in defining what a robot is will likely also open up accounting loopholes”
    • “in theory at least, robots spur greater overall wealth through increased productivity”
  • 2017, A case study in combating bias
    • “framing [bias] not as a personal defect but as something that’s just there”
    • “we also saw champion and sunflower biases, which are about hierarchical patterns and vertical power distance”
    • “there was a feeling within the rank and file who produced the investment valuations for major decisions that certain scenarios were not desired—that you exposed yourself to the risk of being branded an eternal naysayer, or worse, when you pushed for more pessimistic scenarios”
    • “we’ve now made it mandatory to list the debiasing techniques that were applied as part of any major proposal that is put before us as a board”
    • “important for us to start to create an atmosphere in which people are comfortable with a certain degree of conflict, where there is an obligation to dissent”
    • “we now appoint a devil’s advocate—someone who has no personal stake in the decision and is senior enough in the hierarchy to be as independent as possible, usually a level below the executive board”
  • 2017, The Long, Slow, Rotten March of Progress
    • “if all the work goes to machines, how do all these new programmers feed themselves”
    • “how often does anyone have a really good idea? What you actually get is just code, sloshing around, congealing into apps and firms that exist simply to exist”
    • “behind the network-ready grins of all the digital sibyls, beaming about smart appliances and AI interfaces from dozens of stages, I could see something familiar from the Battle of Jefferson Davis Parkway: the rabid determination to keep on building a broken world. We don’t need a tech revolution, we don’t need to learn how to code. What we need, at long last, is for something to finally change”