Explore the critical importance of interpretable machine learning in high-stakes decision-making in this 46-minute conference talk by Professor Cynthia Rudin of Duke University. Delve into the problem of optimal scoring systems: their history, their design, and their practical applications in healthcare and criminal justice. Learn about the first practical algorithm for building optimal scoring systems from data, and consider the societal consequences of relying on black box models. Discover the advantages of interpretable models over black box approaches in domains such as bail decisions, healthcare, and finance. Examine case studies, including the COMPAS recidivism prediction tool, and gain insight into the ethical implications of AI-driven decision-making. Engage with current research on risk scores, recidivism prediction, and seizure probability assessment in hospitalized patients.
Scoring Systems - At the Extreme of Interpretable Machine Learning - Cynthia Rudin - Duke University
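To make the central concept concrete: a scoring system is a sparse linear model whose coefficients are small integers, so a person can evaluate it by adding up a few points. The sketch below is a purely hypothetical toy example (the feature names, point values, and logistic calibration are illustrative assumptions, not the model or algorithm from the talk):

```python
import math

# Hypothetical point assignments; small integers keep the model auditable.
POINTS = {
    "prior_offenses_ge_2": 2,
    "age_under_25": 1,
    "unemployed": 1,
}

def score(features):
    """Sum the points for every binary feature that is present."""
    return sum(POINTS[name] for name, present in features.items() if present)

def risk_probability(total, intercept=-2.0, scale=1.0):
    """Map an integer score to a probability via a logistic link
    (intercept and scale here are made-up calibration values)."""
    return 1.0 / (1.0 + math.exp(-(intercept + scale * total)))

person = {"prior_offenses_ge_2": True, "age_under_25": True, "unemployed": False}
s = score(person)        # 2 + 1 = 3 points
p = risk_probability(s)  # probability increases monotonically with the score
```

The hard part, and the subject of the talk, is choosing which features to include and which integer points to assign so that the resulting model is optimal; the example above only shows how such a model is used once it exists.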