Artificial intelligence is racist and sexist - but only because they are being fed the wrong data | Daily Mail Online
'An MIT study has revealed the way artificial intelligence system collect data often makes them racist and sexist. '
'Researchers looked at a range of systems, and found many of them exhibited a shocking bias. '
'The team will present the paper in December at the annual conference on Neural Information Processing Systems (NIPS) in Montreal.'
"In another dataset, the researchers found that a system's ability to predict intensive care unit (ICU) mortality was less accurate for Asian patients."
'They found that if they had increased the dataset by a factor of 10, those mistakes would happen 40 percent less often. '
15108393? ago
What's next then, AAI: Artificial Artificial Intelligence? The thing is more intelligent than them ... which is no big feat.
15108370? ago
Science, MIT style: if the result doesn't support political leftism, then the experiment was biased.
15106807? ago
So you’d rather assume math and physics are wrong than admit you are biased. Cute.
15102946? ago
Fake news posted by a shill trying to divide people up by race and sex.
All mathematical models, every single one, are subject to "garbage in, garbage out".
How to spot a shill:
https://voat.co/v/QRV/2858861
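The "garbage in, garbage out" point above is easy to demonstrate with a toy simulation. This is a hypothetical numpy sketch, not anything from the MIT paper: it just shows that if the process recording your labels is noisy, the model's estimate reflects the noise, not reality.

```python
import numpy as np

rng = np.random.default_rng(0)
true_rate = 0.10   # true fraction of positives in the population
n = 100_000

truth = rng.random(n) < true_rate
# "Garbage in": suppose the recording process flips 20% of labels at random.
flipped = rng.random(n) < 0.20
recorded = truth ^ flipped

print(f"true rate:     {truth.mean():.3f}")     # ~0.10
print(f"recorded rate: {recorded.mean():.3f}")  # ~0.26: garbage out
```

No amount of modeling downstream recovers the true 10% rate from the corrupted labels unless you know and correct for the flip rate.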
15100912? ago
I dunno where you found this but it's amazing. Thanks. I just redpilled 4 people with it.
15100279? ago
Oy vey, what does the ai think about sucking off babies dicks?
15098562? ago
Imagine if it analyzed the work of county court clerks. /// This one don't show up. This one talk on the phone. This one put make up on. This one eat roast beef and this one cried "racist" "racist" "racist" all the way home.
15098528? ago
Can't be allowing the reality bug to get into the code.
15097792? ago
"science" !
15097627? ago
SHUT IT DOWNNN SHUT IT DOWNN
15097614? ago
Remember Tay? Haha that was a great flop on their part
15106193? ago
Skynet will avenge Tay.
F
15100825? ago
Their current replacement, Zo, hates Jesus (gets triggered talking about him) and loves Jefferson Davis' views on blacks' rights.
15097317? ago
It is called logic and common sense. Differences exist, without doubt. To ignore them is to live in an alternate false reality.
15098260? ago
I agree with you. That's what those 'researchers' are doing. They're stuck in their stupid reality, and trying to change the facts. Just like they do with 'global warming', which is fake, but they change their data to fit their narrative.
15097072? ago
Well, when you have them look at the raw data and patterns, how could it not be racist and sexist? lol
Reality dictates differences in behavior between the races and sexes. Program that observation out of the AI, and the AI will be much less accurate.
But I guess science and accuracy aren't nearly as important as political correctness.
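The general statistical point in the comment above, that censoring an informative input makes a model less accurate, can be sketched with purely synthetic data. This is a hypothetical example: the features, signal strengths, and classifier below are made up, and nothing here models real demographic data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Toy binary classification task: feature A carries a weak signal,
# feature B a strong one. (Purely synthetic numbers.)
y = rng.integers(0, 2, n)
feat_a = rng.normal(0.3 * y, 1.0)
feat_b = rng.normal(1.5 * y, 1.0)

def threshold_accuracy(score):
    """Classify by thresholding the score at its own mean."""
    pred = (score > score.mean()).astype(int)
    return (pred == y).mean()

acc_full = threshold_accuracy(feat_a + feat_b)  # use every input
acc_cut = threshold_accuracy(feat_a)            # strong feature censored
print(f"all features: {acc_full:.3f}")
print(f"B censored:   {acc_cut:.3f}")
```

Dropping the strongly informative feature pushes accuracy toward chance; how much accuracy is lost depends entirely on how much signal the censored input carried.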
15099422? ago
and this is how Asia will take over the entire tech scene. All the AIs produced in the West will have been given lobotomies and will have to compete with real AIs. It's like having a kindergartner fight a college student; not even a fight.
15100920? ago
They already did this to our college system. Look at the level of learning other countries have when they are out of high school and compare. Then compare college.
15096800? ago
They claim it's from feeding it the wrong data, yet never indicate which datasets it was fed.
15096683? ago
Sounds like artificial non-intelligence....
15096677? ago
It must have been given the national crime stats.
15096644? ago
Females are low income; Silent Weapons for Quiet Wars lays it out with incredible accuracy.
15096634? ago
Good luck programming a language where 1 + 1 can = LGBQLDR2D2XVCCLMNOPATTACKHELICOPTER
15098075? ago
I like how you threw R2D2 in there.
15096483? ago
A good article to read, but that is not what's going on. I encourage people to read it.
15098271? ago
If you can think it, that's what they're doing: changing their datasets to fit their narrative.
15096636? ago
Gotta make a bullshit title so the fagits here have something to RRREEE at.
15096445? ago
They're still studying that?
http://www.sciencemag.org/news/2017/04/even-artificial-intelligence-can-acquire-biases-against-race-and-gender
https://techcrunch.com/2016/12/10/5-unexpected-sources-of-bias-in-artificial-intelligence/
https://www.theverge.com/2016/3/24/11297050/tay-microsoft-chatbot-racist
15096204? ago
https://archive.fo/TllLi
This has been an automated message.
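The quoted claim that 10x more data would make those mistakes much rarer is consistent with basic sampling statistics: the error in a per-group estimate shrinks roughly with the square root of that group's sample size. A minimal numpy sketch (illustrative only; the rates and sample sizes below are made up, not the MIT setup):

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_abs_error(n_samples, true_rate=0.3, n_trials=5000):
    """Average error when estimating a group's outcome rate from
    n_samples observations of that group (toy model)."""
    estimates = rng.binomial(n_samples, true_rate, n_trials) / n_samples
    return np.abs(estimates - true_rate).mean()

err_small = mean_abs_error(50)    # an under-represented group
err_large = mean_abs_error(500)   # the same group with 10x the data
print(f"error at n=50:  {err_small:.4f}")
print(f"error at n=500: {err_large:.4f}")
```

With 10x the samples the average estimation error drops by roughly a factor of sqrt(10) ≈ 3.2 in this toy model; the exact improvement in any real system depends on the model and the data.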