On September 8, 2021, Women’s World Banking hosted a virtual panel discussion on “Using AI to Develop Gender Sensitive Solutions” as part of its Making Finance Work for Women Thought Leadership Series.
Moderated by Janet Truncale, Vice Chair and Regional Managing Partner of EY’s Americas Financial Services Organization, the panel included the following recognized experts: Claudia Juech, Vice President of Data and Society at the Patrick J. McGovern Foundation; Harshvardhan Lunia, Co-Founder and CEO of LendingKart; and Pavel Vyhnalek, Private Equity and Venture Capital Investor and former CEO of Home Credit Asia. The panel also featured opening remarks by Christina Maynes, Senior Advisor for Market Development, Southeast Asia at Women’s World Banking, and closing remarks by Samantha Hung, Chief of the Gender Equality Thematic Group at the Asian Development Bank.
AI and Women’s Financial Inclusion
Artificial intelligence (AI) and machine learning (ML) have revolutionized the financial services industry. Considering the implications of this shift, the panel addressed how these disruptions can drive women’s financial inclusion and economic empowerment, as well as the potential risks of leveraging AI and ML to advance inclusivity.
Artificial intelligence and machine learning hold enormous potential for low-income women in emerging markets. Thanks in large part to affordable smartphones and low-cost data plans, women are becoming data-rich individuals, and their digital footprints are giving them greater access to credit, and on better terms. For "thin-file" women customers (those lacking a credit history), the traditional data used to establish creditworthiness, such as salary or assets, can be discriminatory, resulting in smaller loans or none at all. Alternative data, however, offers financial service providers another set of criteria by which to assess creditworthiness: everything from an individual's utility and telecom payment history to her e-commerce and social media footprint can help open up new credit to women.
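To make the idea concrete, the sketch below shows how a provider might fit a simple repayment model on alternative-data features. The feature names, the "repaid" label, and the scikit-learn setup are illustrative assumptions, not a description of any panelist's actual system.

```python
# Minimal sketch: scoring thin-file applicants with alternative data.
# Feature and column names are hypothetical; a real deployment would use the
# provider's own data pipeline and far more rigorous validation.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Alternative-data features in place of salary or asset records.
FEATURES = [
    "utility_payments_on_time_12m",   # on-time utility payments, last 12 months
    "telecom_topups_per_month",       # average prepaid mobile top-ups
    "ecommerce_orders_6m",            # e-commerce purchase volume, last 6 months
    "mobile_money_txn_per_month",     # mobile wallet activity
]

def train_alt_data_scorer(df: pd.DataFrame) -> LogisticRegression:
    """Fit a simple repayment model on alternative-data features."""
    X_train, X_test, y_train, y_test = train_test_split(
        df[FEATURES], df["repaid"], test_size=0.2, random_state=0
    )
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")
    return model

# Hypothetical usage with a provider's historical loan data:
# model = train_alt_data_scorer(loan_history_df)
# approval_prob = model.predict_proba(applicant_row[FEATURES])[:, 1]
```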
Tackling Gender Bias and Privacy
Although AI and ML capabilities bear much promise in terms of driving financial inclusion, the panel noted that gender bias does exist and can leave women disadvantaged or deprioritized. For example, if a data sample does not adequately represent women, neither will the output of the AI and ML models trained on it. Additionally, the biases of the individuals who build and curate algorithms and data sets, perpetuated by societal and cultural norms, can carry over into those systems. As more financial service providers invest in AI and ML capabilities, the panel emphasized the need for women to be actively involved in developing AI-enabled products and services to help combat gender bias, noting that too few women enter or remain in data science careers. Panelists further stressed the importance of greater female representation at all levels of the financial services industry.
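The sketch below illustrates two basic checks a provider could run: the share of women in a training sample and the gap in approval rates between men and women. The column names and the pandas-based approach are assumptions for illustration only.

```python
# Minimal sketch: quick representation and disparity checks before and after
# training. Column names ("gender", "approved") are illustrative.
import pandas as pd

def representation_check(df: pd.DataFrame) -> pd.Series:
    """Share of each gender in the training sample."""
    return df["gender"].value_counts(normalize=True)

def demographic_parity_gap(df: pd.DataFrame) -> float:
    """Difference in approval rates between men and women on scored data."""
    rates = df.groupby("gender")["approved"].mean()
    return float(rates.get("male", 0.0) - rates.get("female", 0.0))

# Hypothetical usage:
# print(representation_check(training_df))   # e.g. female 0.31, male 0.69
# print(demographic_parity_gap(scored_df))   # a positive gap disadvantages women
```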
As AI becomes increasingly personalized, privacy and security concerns have also grown, and panelists underscored the importance of balancing data access with privacy interests: by withholding access to their data, for instance, customers may limit their ability to generate the alternative data needed for credit scoring. Panelists agreed, though, that obtaining customer consent is critical for all financial service providers using AI and ML.
Ongoing Efforts
As part of the panel event, Sonja Kelly, Director of Research & Advocacy at Women's World Banking, highlighted some of the organization's initiatives focused on gender-smart credit scoring. In partnership with LendingKart and Data.org (a collaboration between the Mastercard Center for Inclusive Growth and the Rockefeller Foundation), Women's World Banking is working to make credit available to women entrepreneurs by increasing representation in data pipelines and ensuring algorithms are fair to women applicants. Women's World Banking has also created an interactive toolkit built on a synthetic data set, with which financial service providers can detect and mitigate gender bias in credit-scoring models; further information can be found in the report Algorithmic Bias, Financial Inclusion, and Gender, released in February 2021.
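As a rough illustration of the kind of audit such a toolkit enables (not the toolkit's actual implementation), the sketch below generates a small synthetic applicant pool, applies a stand-in scoring rule, and compares approval rates by gender.

```python
# Rough illustration only: build a synthetic applicant pool, score it with a
# stand-in rule, and compare outcomes by gender. A real audit would call the
# provider's own model instead of the toy scoring rule below.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 5_000

synthetic = pd.DataFrame({
    "gender": rng.choice(["female", "male"], size=n, p=[0.5, 0.5]),
    "on_time_payments": rng.poisson(10, size=n),
    "txn_volume": rng.gamma(2.0, 500.0, size=n),
})

# Stand-in scoring rule (hypothetical weights).
synthetic["score"] = (
    0.05 * synthetic["on_time_payments"] + 0.0004 * synthetic["txn_volume"]
)
synthetic["approved"] = synthetic["score"] > synthetic["score"].median()

# Approval rate by gender; a large gap would flag the model for review.
print(synthetic.groupby("gender")["approved"].mean())
```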
Aimed at driving action towards greater women’s economic empowerment, Making Finance Work for Women provides a critical platform for stakeholders and thought leaders in the financial inclusion sector to engage on key issues. The series also showcases Women’s World Banking’s research, expertise, and upcoming projects. For more information on the series and upcoming events, please visit the website.