  • JoBo@feddit.uk (OP) to Technology@lemmy.world · On Being an Outlier · 10 months ago

    > The way he plays with the meaning of words

    She (or, if you’re not sure, they).

    > any kind of bureaucratic or rule-based decision-making

    Human-written rules are often flawed, and for similar reasons (the one human thought process that 'AI' is very good at reproducing is system justification). But human-written rules can be written down and interrogated. Apple, by contrast, landed itself in front of regulators because it had no clue how its credit algorithm worked and could not conceive how the algorithm could possibly be sexist when it was never given any gender data to analyse.

    > Perhaps that is the point.

    That is, indeed, the point.




  • This is an easy statement to make, but context matters. In this case, he was not named by the media; yet had they not covered the story, he would never have been charged, because it suited the political establishment to do nothing at all.

    > Higgins alleged she was raped by a colleague in an exclusive 2021 television interview with Network Ten's "The Project" program, which also raised questions about the official response by ministers and political staffers in the aftermath of the alleged assault.

    > After the interview aired, Lehrmann was charged with sexual intercourse without consent, but the trial was abandoned in 2022 due to juror misconduct and not revived due to fears about Higgins' mental health.






  • JoBo@feddit.uk (OP) to Technology@lemmy.world · On Being an Outlier · 10 months ago

    > The data cannot be understood. These models are too large for that.

    Apple says it doesn't understand why its credit card gives lower credit limits to women than to men, even when they have the same (or better) credit scores, because it doesn't use sex as a data point. But it's freaking obvious why, if you have a basic grasp of the social sciences and humanities. Women were not given the legal right to their own bank accounts until the 1970s. After that, banks could be forced to grant them accounts, but not to extend the same amount of credit. Women earn and spend in ways that differ, on average, from men. So the algorithm does not need to be told that an applicant is a woman; it just identifies her as the sort of person who earns and spends like the class of people who historically got lower credit limits.

    Apple’s ‘sexist’ credit card investigated by US regulator

    Garbage in, garbage out. Society has been garbage for marginalised groups since forever, and there's no way to take that out of the data, especially not big data. You can try, but you just end up playing whack-a-mole with new sources of bias, many of which cannot be measured well, if at all.
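
    To make the proxy effect concrete, here is a minimal sketch with entirely invented numbers (nothing from Apple's actual system is known or used here): a regression fitted without any gender column still reproduces the historical gap, because income and spending patterns stand in for the missing attribute.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # Protected attribute: 1 = woman, 0 = man. The model never sees this column.
    woman = rng.integers(0, 2, n)

    # Hypothetical features shaped by historical inequity: lower average
    # recorded income, and a spending pattern that correlates with gender.
    income = rng.normal(50 - 10 * woman, 8, n)
    spending = rng.normal(1.0 * woman, 0.5, n)

    # Historical credit limits carry the bias directly: an explicit penalty
    # applied to women, on top of the income effect.
    past_limit = 2 * income - 15 * woman + rng.normal(0, 5, n)

    # Ordinary least squares on income and spending only -- no gender column.
    X = np.column_stack([np.ones(n), income, spending])
    coef, *_ = np.linalg.lstsq(X, past_limit, rcond=None)
    predicted = X @ coef

    gap = predicted[woman == 0].mean() - predicted[woman == 1].mean()
    print(f"Predicted limit gap (men minus women): {gap:.1f}")  # clearly > 0
    ```

    The model never receives the gender column; it only needs features that correlate with it. That is the whack-a-mole problem: remove one proxy and the fit simply shifts onto the next.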