Aug 20, 2:00pm
Identify and remove bias from AI models
By IBM Developer
Fairness in data and machine learning algorithms is critical to building safe and responsible AI systems. Assessing fairness gives you a way to understand the practical implications of deploying a model in a real-world situation.
In this workshop, you will learn how to use a diabetes data set to predict whether a person is likely to develop diabetes. You’ll use IBM Watson Studio, IBM Cloud Object Storage, and the AI Fairness 360 Toolkit to prepare the data, apply a bias mitigation algorithm, and then analyze the results.
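To give a sense of what working with the AI Fairness 360 Toolkit looks like, here is a minimal sketch of measuring and mitigating dataset bias. It assumes a numeric pandas DataFrame df with a binary label column "diabetes" and a protected attribute column "age_group" (both hypothetical names; the workshop's actual data set and chosen mitigation algorithm may differ). Reweighing is shown only as one example of an AIF360 pre-processing algorithm.

```python
import pandas as pd
from aif360.datasets import BinaryLabelDataset
from aif360.metrics import BinaryLabelDatasetMetric
from aif360.algorithms.preprocessing import Reweighing

# Wrap the DataFrame so AIF360 knows which column is the label and which
# is the protected attribute (column names here are illustrative).
dataset = BinaryLabelDataset(
    df=df,                                    # assumed numeric DataFrame
    label_names=["diabetes"],
    protected_attribute_names=["age_group"],
    favorable_label=0,                        # assumption: 0 = no diabetes
    unfavorable_label=1,                      # assumption: 1 = diabetes
)

privileged = [{"age_group": 1}]
unprivileged = [{"age_group": 0}]

# Measure bias before mitigation.
metric_before = BinaryLabelDatasetMetric(
    dataset, privileged_groups=privileged, unprivileged_groups=unprivileged
)
print("Disparate impact before:", metric_before.disparate_impact())

# Apply a pre-processing mitigation algorithm (Reweighing, as an example).
rw = Reweighing(unprivileged_groups=unprivileged, privileged_groups=privileged)
dataset_transf = rw.fit_transform(dataset)

# Measure bias again after mitigation.
metric_after = BinaryLabelDatasetMetric(
    dataset_transf, privileged_groups=privileged, unprivileged_groups=unprivileged
)
print("Disparate impact after:", metric_after.disparate_impact())
```

A disparate impact close to 1.0 (and a mean difference close to 0) indicates less bias between the privileged and unprivileged groups; the workshop walks through this workflow inside Watson Studio rather than a local script.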
🎓 What will you learn?
- Create a project using Watson Studio
- Use the AI Fairness 360 Toolkit
👩‍💻 Who should attend?
- Developers
- Machine learning enthusiasts
- Data scientists
- Anyone growing their skills in data science and AI
✍🏼 Prerequisites
- Sign up for or log in to IBM Cloud
Register for the event: https://www.crowdcast.io/e/identify-and-remove-bias
🎙️ Speaker(s)
- Asna Javed, Lead Developer Advocate - https://www.linkedin.com/in/asnajaved/
____________________________________________________________________________
By registering for this event, you acknowledge that this session will be recorded, consent to the recording being featured on IBM media platforms and pages, and agree to the IBM Developer Terms of Use.
https://developer.ibm.com/terms/ibm-developer-terms-of-use/
Hosted by IBM Developer