rssMB to Phys.org · English · 4 days ago

Most users cannot identify AI racial bias—even in training data

phys.org

When recognizing faces and emotions, artificial intelligence (AI) can be biased, for example by classifying white people as happier than people from other racial backgrounds. This happens when the data used to train the AI contains a disproportionate number of happy white faces, leading the model to correlate race with emotional expression.

From Phys.org - latest science and technology news stories via this RSS feed
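The shortcut described above — a model correlating a demographic attribute with a label because the labels are skewed in the training data — can be illustrated with a minimal sketch. The dataset, group names, and skew rates below are hypothetical and exist only to demonstrate the mechanism; this is not the study's actual data or model.

```python
import random
from collections import Counter

random.seed(0)

def make_dataset(n=1000):
    """Build a toy dataset where group "A" is overrepresented among
    "happy" examples (an assumed imbalance, for illustration only)."""
    data = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        # Skewed label rates mimic the imbalance the article describes.
        happy_rate = 0.9 if group == "A" else 0.4
        label = "happy" if random.random() < happy_rate else "neutral"
        data.append((group, label))
    return data

train = make_dataset()

# A naive "model" that memorizes the majority label per group --
# exactly the spurious shortcut a biased classifier can pick up.
majority = {}
for g in ("A", "B"):
    counts = Counter(label for grp, label in train if grp == g)
    majority[g] = counts.most_common(1)[0][0]

print(majority)
```

Because the imbalance, not any real emotional difference, drives the labels, the memorized rule predicts "happy" for one group and "neutral" for the other — the same kind of race–emotion correlation most users in the study failed to spot.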

