NYC Announces Plans To Test Algorithms For Bias (betanews.com)
The mayor of New York City, Bill de Blasio, has announced the formation of a new task force to examine the fairness of the algorithms used in the city's automated systems. From a report: The Automated Decision Systems Task Force will review algorithms that are in use to determine whether they are free from bias. Representatives from the Department of Social Services, the NYC Police Department, the Department of Transportation, the Mayor's Office of Criminal Justice, the Administration for Children's Services, and the Department of Education will be involved, and the aim is to produce a report by December 2019. However, it may be some time before the task force has any sort of effect. While a report is planned for the end of next year, it will merely recommend "procedures for reviewing and assessing City algorithmic tools to ensure equity and opportunity" -- it will be a while before any recommendations might be assessed and implemented.
so when the data presents a "racist" result... (Score:4, Insightful)
...that happens to be correct, do they throw the whole thing out to spare everyone's feelings?
Re:so when the data presents a "racist" result... (Score:4, Insightful)
I'm guessing it won't come to that, they won't allow any white people (especially males) on that task force, just to make sure it keeps things "fair".
Re:so when the data presents a "racist" result... (Score:5, Insightful)
Re:so when the data presents a "racist" result... (Score:5, Funny)
Re:so when the data presents a "racist" result... (Score:5, Insightful)
An unbiased algorithm means removing race and sex from consideration. But this is not what is being sought. Removing these two factors from school admission or job qualification determinations will produce results that will be automatically labeled racist or sexist. Competency and quality will always rise to the top regardless of race or sex, but that is not an acceptable result in today's society.
Re: (Score:2)
Is it legal to advertise more in minority communities in order to expand the pool of minority applicants?
Re: (Score:1)
I guess that depends on how you are measuring competency among people who don't yet work for you (in your example anyway). If you are ignoring race and sex but are using other indicators that are simply stand-ins for race and sex (went to a women's university or a predominantly black university, home address in a minority part of town), then you end up with the same problem; you have just changed the way the bias is introduced.
Re: (Score:1)
"because racism is pretty rampant here in the US still"
Not really when you look at the diversity present in the US compared against any other country on the planet. Social media has distorted the entire situation to the point that it prevents any rational discourse on the entire subject. A small number of incidents have taken on the appearance of a problem out of control, until you actually compare the number of incidents against the total number of people living in the country. When someone says that the police
Re:You live in Europe or something? (Score:4, Insightful)
Racism! Racism! Racism! So that is the explanation? Sounds pretty simple, and no doubt there are all sorts of training, education, consciousness raising, rules, regulations, reports, graphs, ratios, inspections and oversight that you might suggest to address it. (Along with many ardent comrades to perform the work!) Reminds me of something H. L. Mencken said: "Explanations exist; they have existed for all time; there is always a well-known solution to every human problem — neat, plausible, and wrong."
The problem with assigning all of this to "racism" is that you are going to miss a huge problem that is out there, one that is very old, and whose effects are well known through the ages. And the worst part from a Progressive perspective? It is color blind. Nonetheless, you should consider the following, including the quote from former president Barack Obama.
New Report: Majority of U.S. Teens Don’t Live in Intact Families [dailysignal.com]
. . . Overall, teenagers in intact families are more likely to be emotionally healthy, have higher self-esteem, and progress further in education. Boys who have grown up with their married biological parents are particularly less likely to have behavioral problems, such as heightened aggression or substance abuse. Teenage pregnancy rates are seven to eight times higher for girls whose fathers are absent. As President Obama clearly stated in 2008, the absence of a father is significant:
We know the statistics—that children who grow up without a father are five times more likely to live in poverty and commit crime; nine times more likely to drop out of schools and twenty times more likely to end up in prison. They are more likely to have behavioral problems, or run away from home, or become teenage parents themselves. And the foundations of our community are weaker because of it.
. . . In fact, children who do not have intact families are disproportionately concentrated in the lower third of the income scale. The FRC report reveals harsh realities for children from low-income communities. In Chicago, 86 percent of African American children don’t live with both of their married parents. Many poor neighborhoods across the U.S. show similar realities: 85 percent of children in Detroit and 64.5 percent of children in Richmond, Virginia, were born to single mothers.
How do you turn this around? The good news is that there are some amazing nonprofits whose goal is to help restore strong marriage. One example is First Things First in Richmond, which provides education programs that encourage active fatherhood and strengthen marriage in Richmond’s low-income communities. They teach adolescents and young adults the three keys to avoiding poverty: (1) graduate high school, (2) get married, and then (3) have kids. The order is important. Their results are real: More children are protected from the pain of broken families and the risks of poverty.
The advantage of working to build and sustain healthy families is that it pays dividends in many respects. The problems are not particularly nebulous and there are clear things to be done. Unfortunately, from some perspectives, the prospects for virtue signaling and outrage are not as good, hence indifference. Well, maybe not indifference.
Shapiro on why the Left loves broken families [conservativewoman.co.uk]
The family is a bulwark against the State, so obviously if you are a Leftist who loves the big State you cannot stand strong, independent families.
Re:You live in Europe or something? (Score:5, Insightful)
YES, there is institutional racism here in the U.S. Particularly in colleges and universities. And it has been in place for decades.
It is called "affirmative action". Studies have shown pretty conclusively that it has not worked. And the Supreme Court has signaled that it's probably on its way out.
There was a lot LESS racism in America before Obama got elected. I'm not going to rant about the reasons, but that's a simple fact. Racial tensions were so much higher when he left office than when he began, there is hardly any comparison.
Strangely, a lot of that racial tension has slacked off in the last year. Not all, but a good bit of it.
Why is that, do you think?
Re: (Score:1)
Re: (Score:2, Interesting)
Racism in America ...
From my point of view, racism in America is visible in immigration policies.
I am not allowed to migrate to the US because I am too white, too male, too hetero and too European.
And too law-abiding to join The Caravan or overstay my short visit to the US.
And those policies were introduced under progressive Clinton and Obama rules...
Re: (Score:2)
Explain why you believe that's not what they are seeking?
Do you have some evidence or is this just an automatic Pavlovian response? Or just some cynical "must be oppressing white men" rubbish?
Re: (Score:2)
...that happens to be correct, do they throw the whole thing out to spare everyone's feelings?
They either throw it out or change it to get the result they like. If those are the choices, I'd prefer the former.
Of course they can also go after whoever wrote the racist algorithm.
Re: (Score:2)
Re: (Score:2)
yeah like marijuana use... the way it's more prevalent in white communities than in black ones, but blacks are arrested at a higher rate and the severity of punishment is higher... oh, wait, that's the opposite.
Re: (Score:3)
In order to be fair, wouldn't it be necessary to take into account all the unfairness that took place during the last couple of centuries, and give an edge to the groups who were treated unfairly?
Why just 2 centuries?
Why not 5 centuries? Or 10 centuries? Why only America? Were there no racial injustices in the world until America was colonized?
What about Irish slaves in the pre-civil-war US? There were more Irish slaves than Africans by the time the Civil War occurred. Hell, Irish slaves were cheaper! Don't they get consideration for their slavery?
*You're* being racist in only wanting certain races you care about included.
Strat
Re: (Score:1)
He's right about the Irish slaves, though. King Henry was shipping over countless boatloads of Irish to the Colonies as slaves at far below the prices of African slaves. That's how England dealt with Irish criminals, troublemakers, potential troublemakers, and any others that annoyed them in some way, often by just being Irish or having/living on land they wanted.
fair (Score:4, Insightful)
Re: (Score:2)
To be "fair", a model might operate on correlations - just a naive fit of the data. Some of these correlations might have a reasonable chance of reflecting cause and effect, and others might be a coincidence. In a perfect world, you'd suss out all of the bogus correlations - but if you are resource constrained, then going after the ones that are reinforcing systemic bias is a reasonable place to direct your efforts.
Re: (Score:2, Interesting)
What if the bias happens to NOT be a spurious correlation?
There's no natural law that says that unwanted correlations aren't real. For example, white men really can't jump as well as black ones. Women tend to be better educated. What we need to come to terms with socially is how to accept uncomfortable findings like that without assigning collective blame, to realize that individuals can be exceptions, especially with effort, etc.
The hard part is dealing with things like crime. We want to stop crime so
Re: (Score:2)
What if the bias happens to NOT be a spurious correlation?
Then cascadingstylesheet's critique is valid.
white men really can't jump as well as black ones
Well, here's a good example... if you have a scientific reason to say this, then fine - it's a cold hard fact. But if you just happen to notice that blacks are overrepresented in athletics, that does not tell you whether blacks jump higher or whether blacks are raised in conditions which result in an emphasis on athletics. If you write an athlete-recruiting algorithm and it weights blacks for no particular reason other than this correlation... that is exactly the
Re: (Score:2)
What seems to be confusing the geeks and nerds is the whole idea of bias in algorithms; it makes no sense until you stop and think. What if a couple of racist idiots got the programmers to add racist elements to the data manipulation functions and they simply don't want to fess up to it and instead claim biased algorithms? You know, the political appointees who got into office purely based on their electioneering and nothing whatsoever to do with their honesty, reliability, skill set or fitness for the job. T
Reality is racist (Score:2, Insightful)
It's the typical response you get from liberals when you point out that their reasoning doesn't match the world: reality is racist.
Try to bring crime statistics or other facts into a discussion about how weird it is that the areas with the most liberal voters are also the ones that, strangely, have the most crime, or how statistics prove that it's not racism, that immigrants really do bring crime with them - and they'll just reply that "reality is racist."
This is the same thing. Of course the computer
I guess it depends on what you mean by fair (Score:4, Informative)
Good for them! (Score:2, Funny)
If they want precrime algorithms and AI to ensure no one is smoking or drinking sugary beverages then more power to them, as long as they keep their technocratic dystopia confined to NYC they can do whatever they want.
Cat tongue (Score:4, Insightful)
"Well, the legislature, governor, and city council have to go..."
"Yes. Wait, we haven't even started yet!"
The name for this will be... (Score:5, Funny)
Open Source - Open Data (Score:5, Insightful)
Put in a mandate that all government algorithms must be open sourced in an easily accessible fashion, and all data passed through them must also be easily accessed. This will enable 3rd parties, ANY 3rd party, not just contracted "companies" (usually in the pockets of the people making decisions), to audit the code and data for flaws.
One of the largest issues I've seen in the past with these systems is that they falsely assume correlation = causation. And quite often, the cause and effect are backwards, too. One example I always liked was the claim that "overhead high voltage power lines cause health issues for those that live near them": once the data was updated with more inputs, it was discovered that the cause was entirely different. High voltage power lines are unsightly, causing housing values around them to be below the average for the community. Poorer families were buying/renting those homes. Poorer families are more likely to have health issues due to financial constraints. In the end, the correlation wasn't causation; both factors shared a common root cause.
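To make the audit angle concrete, here is a rough sketch of the kind of check an outside party could run once the code and data are public: test whether a correlation survives after stratifying on a plausible confounder. All numbers below are invented for illustration; this is not the actual power-line data.

```python
# Hypothetical illustration: a correlation that disappears once a
# confounder (household income) is controlled for. All numbers are
# invented; this is not real power-line study data.
import random

random.seed(0)

rows = []
for _ in range(10_000):
    income = random.gauss(50_000, 15_000)
    # Lower-income households are more likely to live near the lines.
    near_lines = random.random() < max(0.05, 0.6 - income / 150_000)
    # Health problems depend on income only, not on the power lines.
    health_problem = random.random() < max(0.02, 0.4 - income / 200_000)
    rows.append((income, near_lines, health_problem))

def rate(subset):
    return sum(h for _, _, h in subset) / max(len(subset), 1)

near = [r for r in rows if r[1]]
far = [r for r in rows if not r[1]]
print(f"raw rates: near={rate(near):.3f} far={rate(far):.3f}")

# Stratify by income: within each band the apparent effect of the
# power lines largely vanishes.
bands = [("low", float("-inf"), 35_000), ("mid", 35_000, 65_000), ("high", 65_000, float("inf"))]
for label, lo, hi in bands:
    band = [r for r in rows if lo <= r[0] < hi]
    n = [r for r in band if r[1]]
    f = [r for r in band if not r[1]]
    print(f"{label:>4} income: near={rate(n):.3f} far={rate(f):.3f}")
```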
Re: (Score:1, Insightful)
"Put in a mandate that all government algorithms most be open sourced in an easily accessible fashion, and all data passed through them must also be easily accessed. "
Congratulations, you just earned yourself a lifetime ban in all conversations discussing climate change models and the input data used by the climate change models.
Re: (Score:1)
Congratulations, you just sent your own credibility deeper into the gutter:
http://www.realclimate.org/ind... [realclimate.org]
Re: (Score:3)
May not be possible due to releasing personally-identifiable information that should otherwise be kept confidential.
Re: (Score:2)
Re: (Score:2)
Put in a mandate that all government algorithms must be open sourced in an easily accessible fashion, and all data passed through them must also be easily accessed. This will enable 3rd parties, ANY 3rd party, not just contracted "companies" (usually in the pockets of the people making decisions), to audit the code and data for flaws.
This will prevent the use of modern "AI" technology, as the models produced by layered neural networks are not reasonably comprehensible to humans. Although this isn't a bad price to pay for rooting out biased black boxes.
Also making all data passed through the algorithms public could conflict with legitimate privacy needs, such as with health records.
Algorithm for Consequences (Score:2)
Re: (Score:1)
So what is the solution, then, to fix the legal system?
Just let the burglar go and drop the charges?
Humans aren't fit to determine Bias in algorithms. (Score:1)
The algorithms deal only in facts and data. Even if they WERE presenting biased results, there is not one human alive, not me, not you, not the almighty assholes in government, nobody, that can assess that bias fairly without throwing their own biases on top of the algorithm for comparison.
You need an algorithm to check the algorithms, but then you're right back to wondering where the bias comes from, the algorithm, or the creator of the algorithm.
And if this is literally just another chance to shove polit
"unbiased" (Score:5, Insightful)
I happen to also be quite skeptical of the legitimacy of disparate impact [wikipedia.org] policy, which states that even if a policy is facially neutral (not imposing rules or criteria tied to protected class attributes), it may be considered discriminatory or "biased" if it affects one group more than another.
While it may be good in theory, I have real trouble with how "unbiased" principles are applied in practice.
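For reference, the way disparate impact usually gets quantified in practice is a selection-rate ratio along the lines of the "four-fifths rule" from the EEOC's Uniform Guidelines. A minimal sketch, with made-up group names and counts:

```python
# Sketch of the common selection-rate-ratio ("four-fifths rule") check
# used in disparate impact analysis. Counts below are invented.
groups = {
    # group: (number selected, number of applicants) -- hypothetical
    "group_a": (48, 120),
    "group_b": (30, 110),
}

rates = {g: selected / total for g, (selected, total) in groups.items()}
reference = max(rates.values())  # group with the highest selection rate

for g, r in rates.items():
    ratio = r / reference
    flag = "potential disparate impact" if ratio < 0.8 else "ok"
    print(f"{g}: rate={r:.2f} ratio={ratio:.2f} -> {flag}")
```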
Algorithms don't spring fully formed out of aether (Score:3, Insightful)
People decide what variables to put in or not, what data to test on or not. Using an "algorithm" doesn't eliminate subjectivity; it just sets it at a distance, out of people's minds even though it's still there.
The mayor of New York City, Bill de Blasio (Score:1)
Just my 2 cents
The logical outcome of "Disparate Impact" theory (Score:3)
Re: (Score:2)
"a higher-than-normal rejection rate for loans for a group doesn't result in a lower-than-normal default rate on those loans"
If the higher-than-normal rejection rate is correct, it SHOULD lead to a "normal" default rate on loans for that group, not significantly more (too lenient) nor significantly lower (too stringent). Managing risks is the whole point and the risk involved should be at a base level for whatever group you look at.
For those that still don't get it, and say "oh well, the lender should just accept the higher losses for those groups" as the price for Looking Decent in the Eyes of the Public, think again. They're n
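A rough sketch of the check the parent describes, sometimes called an outcome test: among approved loans, compare default rates across groups. All records below are invented for illustration.

```python
# Sketch of the outcome test described above: among approved loans,
# compare default rates across groups. All records are invented.
from collections import defaultdict

approved_loans = (
    [("group_a", True)] * 8 + [("group_a", False)] * 92 +   # ~8% default
    [("group_b", True)] * 3 + [("group_b", False)] * 97     # ~3% default
)

totals = defaultdict(lambda: [0, 0])   # group -> [defaults, approvals]
for group, defaulted in approved_loans:
    totals[group][0] += int(defaulted)
    totals[group][1] += 1

for group, (defaults, approvals) in totals.items():
    print(f"{group}: default rate among approved loans = {defaults / approvals:.1%}")

# Roughly equal rates are consistent with a calibrated approval threshold;
# a much lower rate for one group suggests creditworthy applicants from
# that group are being rejected (too stringent), a much higher rate the reverse.
```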
Free from bias? (Score:3)
Who is committing the crime? What crimes? What policing method has best reduced that crime rate around the USA?
Make some maps of the city and fund some new police in the areas with crime.
Add some CCTV. Bring in the next gen IMSI-catchers to see who is talking with criminals. A better voice print system?
Start tracking crime down to a street and building level in real time.
Ask the FBI to provide some statistics on who is doing what crime, when and where. Find the predictable patterns for the police to work with.
Today's tech can map most of the problem areas and detect most of the repeat criminals.
The only bias is not funding the police to do their jobs. Stop looking for "bias" and start funding police work.
Re: (Score:2)
You missed it.
The chief of police has limited resources. He looks at crime stats and directs more officers to areas with more crime. Since there are more of them there, they detect more crime and make more arrests, pushing the stats for that area up even further.
In other words, without understanding what is happening these simplistic assumptions are misleading.
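A toy simulation of that feedback loop, assuming (purely for the sake of the sketch) that the share of crime that gets recorded rises faster than linearly with patrol presence, while the underlying crime in both areas stays identical:

```python
# Toy model of the feedback loop: patrols are allocated in proportion to
# last year's *recorded* crime, and (an assumption of this sketch) the
# share of crime that gets recorded grows faster than linearly with
# patrol presence. Underlying crime is identical; numbers are invented.
UNDERLYING = {"area_a": 1000, "area_b": 1000}   # true incidents per year
recorded = {"area_a": 110, "area_b": 90}        # slightly uneven starting stats
TOTAL_PATROLS = 100

def detection_rate(patrol_share):
    # Assumed convex relationship between patrol share and detection.
    return 0.5 * patrol_share ** 1.5

for year in range(1, 6):
    total = sum(recorded.values())
    shares = {area: recorded[area] / total for area in recorded}
    patrols = {area: round(TOTAL_PATROLS * share) for area, share in shares.items()}
    recorded = {
        area: round(UNDERLYING[area] * detection_rate(shares[area]))
        for area in recorded
    }
    print(f"year {year}: patrols={patrols} recorded={recorded}")
```

The recorded-crime gap between the two areas widens every year even though the underlying crime never changes, which is the "misleading stats" point the parent is making.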
Re: (Score:2)
Some areas have low crime. A car is safe during the day and at night. A home is not getting its door or window opened and property is not getting removed.
A person can wait for a bus, drive their car without having to worry about robbery.
Shop for food without having to consider the risks.
Police can map that crime rate. Move in and track the bad people. Entire areas can be policed with better methods and the crime rate can be reduced.
Get the voice prints so