Today's article may need to be read in full before its overall point comes through.
In the late 1970s he foretold the possibility of a scenario that has been dubbed the Kessler syndrome: as the density of space rubbish increases, a cascading, self-sustaining runaway cycle of debris-generating collisions can arise (that might ultimately make low-Earth orbit too hazardous to support most space activities).
In the late 1970s, he predicted a possible scenario that has since been named the "Kessler syndrome": as the density of space junk increases, a cascading, self-sustaining runaway cycle of debris-generating collisions can emerge, ultimately making low-Earth orbit too hazardous to support most space activities.
Both cascading and self-sustaining are adjectives, and of debris-generating collisions ("of collisions that generate debris") also works as a modifier, so the head of the subject phrase is simply runaway cycle.
1. foretell: to say what will happen in the future, especially by using special magical powers
2. scenario: a situation that could possibly happen
3. dub: to give something or someone a name that describes them in some way (the same word also means to dub a film, i.e. to re-record its dialogue in another language)
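The "cascading, self-sustaining runaway cycle" in the passage is a positive-feedback process, and a toy simulation makes the phrase concrete. The sketch below is not Kessler's actual model; every constant in it is invented purely for illustration. The idea: collisions scale with the square of the object count, each collision breeds fragments, and drag removes a small share of objects each year, so above a critical density the cascade keeps growing even with no new launches at all.

```python
# A toy positive-feedback sketch of the Kessler syndrome (not Kessler's
# actual model; all constants here are invented for illustration).

COLLISION_COEFF = 1e-6        # collisions per year per (object count)^2
FRAGMENTS_PER_COLLISION = 5   # new fragments bred by each collision
DRAG_REMOVAL_RATE = 0.02      # share of objects that decay away per year

def simulate(initial_objects: float, years: int = 200) -> list[float]:
    """Yearly object counts, assuming no new launches at all."""
    debris, history = float(initial_objects), []
    for _ in range(years):
        collisions = COLLISION_COEFF * debris**2
        debris += FRAGMENTS_PER_COLLISION * collisions
        debris -= DRAG_REMOVAL_RATE * debris
        history.append(debris)
        if debris > 1e7:      # the cascade has clearly run away; stop
            break
    return history

# Feedback (5e-6 * D^2) overtakes removal (0.02 * D) at D = 4,000 objects,
# so a start below that threshold decays while one above it runs away.
for start in (3_000, 5_000):
    h = simulate(start)
    verdict = "decays" if h[-1] < start else f"runs away by year {len(h)}"
    print(f"start at {start:,} objects -> {verdict}")
```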
Some argue that the world is designed around white men. In the medical system especially, racial bias shows up in many places: systems are designed, consciously or not, to put white people and men first, which poses a deep, latent harm to society.
Beyond medicine, there are many examples of this phenomenon in information technology: systems that recognize white faces but not black ones; legal software which recommends harsher sentences for black criminals than white; voice-activated programs that work better for men than women. Even mundane things like car seat-belts have often been designed with men in mind rather than women.
Beyond medicine, information technology offers many examples of this phenomenon: systems that can recognize white faces but not black ones; legal software that recommends harsher sentences for black offenders than for white ones; voice-activated programs that work better for men than for women. Even something as ordinary as a car seat-belt has often been designed with men, rather than women, in mind.
The origin of such design bias is understandable, if not forgivable. In the West, which is still the source of most innovation, engineers have tended to be white and male. So have medical researchers. That leads to groupthink, quite possibly unconscious, in both inputs and in outputs.
The origin of this design bias may not be forgivable, but it is at least understandable. In the West, which remains the source of most innovation, engineers have tended to be white and male. So have medical researchers. That leads to groupthink, quite possibly unconscious, in both the inputs (the designs) and the outputs (the products).
Input bias is particularly responsible for the IT cock-ups. Much of what is commonly called artificial intelligence is actually machine learning. As with any learning, the syllabus determines the outcome. Train software on white faces or men's voices, and you will create a system that is focused on handling them well. More subtle biases are also in play, though. The faulty medical algorithm used prior medical spending as a proxy for current need. But black Americans spend less on health care than whites, so it discriminated against them.
Input bias bears much of the blame for IT's blunders. Much of what is commonly called artificial intelligence is in fact machine learning, and as with any kind of learning, the syllabus determines the outcome. Train software on white faces or on men's voices and you will create a system that specializes in handling exactly those. Subtler biases are at work too, though. One faulty medical algorithm used past medical spending as a proxy for current need; but black Americans spend less on health care than white Americans do, so the algorithm discriminated against them.
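The proxy mechanism this paragraph describes is easy to demonstrate with synthetic data. In the sketch below, everything (the 20% spending gap, the normal distributions, the 10% care threshold) is invented for illustration and is not taken from the actual algorithm or study: two groups have identical true need, but because one group historically spends less on care, ranking patients by spending flags far fewer of them for extra help.

```python
# A minimal sketch of proxy-label bias: spending stands in for need.
# All numbers are synthetic and chosen only to make the effect visible.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# True medical need is identically distributed in both groups.
group = rng.integers(0, 2, n)                 # 0 = group A, 1 = group B
need = rng.normal(50.0, 10.0, n)

# Historical spending tracks need, but group B spends ~20% less for the
# same need (unequal access to care, not unequal health).
spending = need * np.where(group == 1, 0.8, 1.0) + rng.normal(0.0, 2.0, n)

# The proxy model: rank patients by spending, as if it were need, and
# flag the top 10% for extra care.
cutoff = np.quantile(spending, 0.90)
for g, name in ((0, "A"), (1, "B")):
    flagged = (spending[group == g] >= cutoff).mean()
    print(f"group {name}: {flagged:.1%} flagged for extra care")

# True need is identical by construction, yet group B is flagged far
# less often: the bias lives in the proxy, not in the patients.
```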
Input bias is also a problem in medicine. Despite decades of rules on the matter, clinical trials are still overloaded with white men. As far as sex bias is concerned, trial designers do have half a point. If a female participant became pregnant and the treatment under test harmed her baby, that would be tragic. But there is no excuse for failing to make trials big enough to detect statistical differences between relevant groups.
Input bias is a problem in medicine as well. Despite decades of rules on the matter, clinical-trial participants are still overwhelmingly white men. On sex bias, trial designers do have half a point: if a female participant became pregnant and the treatment under test harmed her baby, that would be a tragedy. But that is no excuse for failing to make trials large enough to detect statistical differences between the relevant groups.
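The last sentence is a statement about statistical power, and the standard calculation shows what "big enough" means. The sketch below uses statsmodels' power analysis; the effect size and the 80/20 enrollment split are hypothetical, chosen only to illustrate the arithmetic.

```python
# How big must a trial be to detect a male-female difference?
# Hypothetical numbers; the API is statsmodels' standard power analysis.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
d, alpha, power = 0.2, 0.05, 0.8   # small effect, conventional thresholds

# Balanced design: required size of each group.
n_balanced = analysis.solve_power(effect_size=d, alpha=alpha, power=power)
print(f"balanced: {n_balanced:.0f} per group, {2 * n_balanced:.0f} total")

# Skewed design: women are 20% of enrollment (4 men per woman).
# solve_power returns the required size of the smaller (women's) arm.
n_women = analysis.solve_power(effect_size=d, alpha=alpha, power=power,
                               ratio=4.0)
print(f"80/20 split: {n_women:.0f} women, {5 * n_women:.0f} total")
# The skewed trial needs substantially more participants overall to
# detect the same difference, which is the article's point about size.
```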
Output bias is more intriguing. In a well-ordered market, competition should introduce diversity quite fast. Look to those who buy medical equipment, and you may see a mix that is more white and male than the population in hospital wards and doctors' waiting rooms. Neither are face-recognition systems or sentencing software bought by those who suffer because of their failures.
Output bias is more intriguing. In a well-ordered market, competition should introduce diversity fairly quickly. Look at who buys medical equipment, though, and you may see a mix that is whiter and more male than the population in hospital wards and doctors' waiting rooms. Nor are face-recognition systems or sentencing software bought by the people who suffer when those systems fail.
Source: The Economist
Published: 2021.04.09
Section: Leaders