A machine learning model makes predictions of an outcome for a particular instance. (Given an instance of a loan application, predict whether the applicant will repay the loan.) The model makes these predictions based on a training dataset, where many other instances (other loan applications) and actual outcomes (whether they repaid) are provided. A machine learning algorithm thus attempts to find patterns, or generalizations, in the training dataset to use when a prediction is needed for a new instance. (For example, one pattern it might discover is "if a person has salary > USD 40K and has outstanding debt < USD 5K, they will repay the loan".) In many domains this technique, called supervised machine learning, has worked very well.
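As an illustration only (this Code Pattern does not necessarily use this library or these exact features), a supervised learner such as a scikit-learn decision tree can discover threshold rules of this kind from a handful of labeled loan applications:

```python
# Hypothetical sketch of supervised learning on loan data.
# Feature names, values, and the use of scikit-learn are illustrative only.
from sklearn.tree import DecisionTreeClassifier

# Each row is a past loan application: [salary in USD, outstanding debt in USD].
X_train = [[45000, 2000], [60000, 1000], [30000, 8000], [25000, 6000]]
y_train = [1, 1, 0, 0]  # 1 = repaid the loan, 0 = did not repay

model = DecisionTreeClassifier().fit(X_train, y_train)

# Predict the outcome for a new, unseen application.
print(model.predict([[50000, 1500]]))  # e.g. [1] -> predicted to repay
```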
However, sometimes the patterns that are found may not be desirable or may even be illegal. For example, a loan repayment model may determine that age plays a significant role in the prediction of repayment because the training dataset happened to have better repayment rates for one age group than for another. This raises two problems: 1) the training dataset may not be representative of the true population of people of all age groups, and 2) even if it is representative, it is illegal to base any decision on an applicant's age, regardless of whether this is a good prediction based on historical data.
AI Fairness 360 is designed to help address this problem with fairness metrics and bias mitigators. Fairness metrics can be used to check for bias in machine learning workflows. Bias mitigators can be used to overcome bias in the workflow to produce a fairer outcome.
When the reader has completed this Code Pattern, they will understand how to:
- Compute a fairness metric on the original data using AI Fairness 360
- Mitigate bias by transforming the original dataset
- Compute the fairness metric on the transformed training dataset (a minimal sketch of these three steps follows this list)
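The three steps above map onto AIF360's dataset, metric, and mitigator classes. The following is a minimal sketch loosely based on AIF360's getting-started example; it uses the toolkit's German credit dataset loader (the underlying raw data file may need to be downloaded separately) and the Reweighing pre-processing mitigator, so the exact dataset, protected attribute, and mitigator used in this Code Pattern's notebook may differ.

```python
# Minimal sketch: check a fairness metric, mitigate bias, re-check the metric.
# Loosely based on AIF360's getting-started example; details may differ from the notebook.
from aif360.datasets import GermanDataset
from aif360.metrics import BinaryLabelDatasetMetric
from aif360.algorithms.preprocessing import Reweighing

# Load the German credit data with 'age' as the protected attribute.
dataset_orig = GermanDataset(
    protected_attribute_names=['age'],
    privileged_classes=[lambda x: x >= 25],   # age >= 25 is treated as privileged
    features_to_drop=['personal_status', 'sex'])

privileged_groups = [{'age': 1}]
unprivileged_groups = [{'age': 0}]

# 1. Compute a fairness metric on the original data.
metric_orig = BinaryLabelDatasetMetric(
    dataset_orig,
    unprivileged_groups=unprivileged_groups,
    privileged_groups=privileged_groups)
print("Original mean difference:", metric_orig.mean_difference())

# 2. Mitigate bias by transforming the original dataset (pre-processing).
rw = Reweighing(unprivileged_groups=unprivileged_groups,
                privileged_groups=privileged_groups)
dataset_transf = rw.fit_transform(dataset_orig)

# 3. Compute the fairness metric on the transformed training dataset.
metric_transf = BinaryLabelDatasetMetric(
    dataset_transf,
    unprivileged_groups=unprivileged_groups,
    privileged_groups=privileged_groups)
print("Transformed mean difference:", metric_transf.mean_difference())
```

A mean difference closer to zero after reweighing indicates that the favorable-outcome rates for the privileged and unprivileged groups have been brought closer together.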
- User interacts with Watson Studio to create a Jupyter notebook
- Notebook imports the AIF360 toolkit.
- Data is loaded into the notebook (a hedged sketch of wrapping loaded data for AIF360 follows this list).
- User runs the notebook, which uses the AIF360 toolkit to assess the fairness of the machine learning model.
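For illustration only, once tabular data is loaded into the notebook (for example as a pandas DataFrame), it can be wrapped in an AIF360 dataset object so that the fairness metrics know which column is the label and which is the protected attribute. The column names and values below are hypothetical and are not taken from this Code Pattern's notebook.

```python
# Hypothetical sketch: wrapping loaded tabular data for AIF360.
# Column names and values are illustrative; all columns must be numeric.
import pandas as pd
from aif360.datasets import BinaryLabelDataset

df = pd.DataFrame({
    'age':    [1, 0, 1, 0],                  # protected attribute (1 = privileged group)
    'salary': [45000, 30000, 60000, 25000],
    'debt':   [2000, 7000, 1000, 4000],
    'repaid': [1, 0, 1, 0],                  # binary label (1 = favorable outcome)
})

loan_data = BinaryLabelDataset(
    favorable_label=1,
    unfavorable_label=0,
    df=df,
    label_names=['repaid'],
    protected_attribute_names=['age'])
```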
- Jupyter Notebook: An open source web application that allows you to create and share documents that contain live code, equations, visualizations, and explanatory text.
- TensorFlow: An open source software library for numerical computation using data flow graphs.
- Watson Studio: Analyze data using RStudio, Jupyter, and Python in a configured, collaborative environment that includes IBM value-adds, such as managed Spark.
- Artificial Intelligence: Artificial intelligence can be applied to disparate solution spaces to deliver disruptive technologies.
- Data Science: Systems and scientific methods to analyze structured and unstructured data in order to extract knowledge and insights.
- Python: Python is a programming language that lets you work more quickly and integrate your systems more effectively.
Either run locally or in Watson Studio, then run the notebook. Both options are described below.
Clone the https://github.com/IBM/ensure-loan-fairness-aif360 repo locally. In a terminal, run:
$ git clone https://github.com/IBM/ensure-loan-fairness-aif360
The code included in this Code Pattern runs in a Jupyter Notebook.
- Start your Jupyter Notebooks. Starting in your ensure-loan-fairness-aif360 cloned repo directory will help you find the notebook and the output as described below. Jupyter Notebooks will open in your browser.
$ cd ensure-loan-fairness-aif360
$ jupyter notebook
- Navigate to the notebooks directory and open the notebook file named credit_scoring.ipynb by clicking on it.
Sign up for IBM's Watson Studio. When you create a project in Watson Studio, a free-tier Object Storage service will be created in your IBM Cloud account. Take note of your service names as you will need to select them in the following steps.
Note: When creating your Object Storage service, select the Free storage type in order to avoid having to pay an upgrade fee.
- In Watson Studio, click on Create notebook to create a notebook.
- Create a project if necessary, provisioning an object storage service if required.
- In the Assets tab, select the Create notebook option.
- Select the From URL tab.
- Enter a name for the notebook.
- Optionally, enter a description for the notebook.
- Enter this Notebook URL: https://raw.githubusercontent.com/IBM/ensure-loan-fairness-aif360/master/notebooks/credit_scoring.ipynb
- Click the Create button.
- Use the menu pull-down Cell > Run All to run the notebook, or run the cells one at a time top-down using the play button.
- As the cells run, watch the output for results or errors. A running cell will have a label like In [*]. A completed cell will have a run sequence number instead of the asterisk.
See examples/example_notebook.ipynb:
- AI Fairness 360 Toolkit (AIF360)
- Contact AIF360 team on Slack
- IBM launches tools to detect AI fairness, bias and open sources some code
- Artificial Intelligence Code Patterns: Enjoyed this Code Pattern? Check out our other AI Code Patterns.
- Data Analytics Code Patterns: Enjoyed this Code Pattern? Check out our other Data Analytics Code Patterns.
- AI and Data Code Pattern Playlist: Bookmark our playlist with all of our Code Pattern videos
- With Watson: Want to take your Watson app to the next level? Looking to utilize Watson Brand assets? Join the With Watson program to leverage exclusive brand, marketing, and tech resources to amplify and accelerate your Watson embedded commercial solution.
- Data Science Experience: Master the art of data science with IBM's Data Science Experience