Paper
International Journal of Geographical Information Science
Title
GeoXCP: uncertainty quantification of spatial explanations in explainable AI
Authors
Xiayin Lou, Peng Luo, Ziqi Li, Song Gao, Liqiu Meng

Affiliations
a. Chair of Cartography and Visual Analytics, Technical University of Munich, Munich, Germany
b. Senseable City Lab, Massachusetts Institute of Technology, Cambridge, MA, USA
c. Department of Geography, Florida State University, FL, USA
d. The Spatial Data Science Center, Florida State University, FL, USA
e. Department of Geography, University of Wisconsin-Madison, Madison, WI, USA

Author Biographies
Xiayin Lou is a PhD candidate at the Chair of Cartography and Visual Analytics, Technical University of Munich. He holds a Master of Science in Cartography and GIS from Peking University. His research focuses on GeoAI, geographic bias, uncertainty-aware explainable spatial modelling, visual analytics, and urban studies. Email: [email protected]

Peng Luo is a Postdoctoral Research Fellow at MIT. He holds a Ph.D. from the Chair of Cartography and Visual Analytics at the Technical University of Munich. His research centers on spatial data science and trustworthy GeoAI. Email: [email protected]

Ziqi Li is an Assistant Professor in quantitative geography at Florida State University. His research focuses on the methodological development of spatially explicit and interpretable statistical/machine learning models to investigate human behavior across space and place. He is one of the primary developers of Multiscale Geographically Weighted Regression (MGWR) and the Python Spatial Analysis Library (PySAL). Email: [email protected]

Song Gao is an Associate Professor in GIScience at the Department of Geography, University of Wisconsin-Madison. He holds a Ph.D. in Geography from the University of California, Santa Barbara. His main research interests include GeoAI, geospatial data science, spatial networks, human mobility, and social sensing. Email: [email protected]

Liqiu Meng is a Professor of Cartography at the Technical University of Munich and a member of the German National Academy of Sciences. She serves as Vice President of the International Cartographic Association. Her research interests include geodata integration, mobile map services, multimodal navigation algorithms, geovisual analytics, and ethical concerns in social sensing. Email: [email protected]
Publication Date
2025/10/27 16:26:07
Source Type
journal
Language
en
Abstract

Understanding and explaining complex geographic phenomena, ranging from climate change to socioeconomic disparities, is a central concern in geography and the broader scientific community. Various methods have been developed to elucidate relationships between variables, from coefficient estimates in linear regression models to the increasingly dominant feature attribution scores produced by Explainable AI (XAI) techniques. However, explanations generated by XAI methods often carry uncertainty stemming from both the model itself and the data used to train it. Despite the critical importance of accounting for such uncertainty, the issue remains largely overlooked in the geospatial domain. In this study, we develop an uncertainty quantification framework for XAI explanations based on conformal prediction, termed Geospatial eXplanation Conformal Prediction (GeoXCP). By incorporating spatial dependence into the modeling process, GeoXCP produces spatially adaptive explanations with calibrated uncertainty estimates. We validate the effectiveness of GeoXCP through extensive simulation experiments and real-world datasets. The results demonstrate that GeoXCP provides reliable explanations while effectively quantifying uncertainty across diverse geospatial scenarios. Our approach represents a significant advance in explainable geospatial machine learning, enabling decision-makers to better assess the trustworthiness of model-driven insights. The framework is implemented in a Python package named GeoXCP.
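The abstract does not show the GeoXCP API itself. As a rough illustration of the conformal-prediction idea the framework builds on, below is a minimal split-conformal sketch in NumPy: a held-out calibration set turns any point predictor's residuals into intervals with a finite-sample coverage guarantee. The function name and toy data are hypothetical, not GeoXCP's implementation, and spatial dependence is not modeled here.

```python
import numpy as np

def split_conformal_intervals(cal_preds, cal_truth, test_preds, alpha=0.1):
    """Wrap a point predictor with intervals covering the truth
    with probability >= 1 - alpha (split conformal prediction)."""
    # Nonconformity scores: absolute residuals on the calibration set.
    scores = np.abs(cal_truth - cal_preds)
    n = len(scores)
    # Finite-sample-corrected quantile level.
    q_level = np.ceil((n + 1) * (1 - alpha)) / n
    q = np.quantile(scores, min(q_level, 1.0), method="higher")
    # Symmetric interval around each test prediction.
    return test_preds - q, test_preds + q

# Toy usage: the "predictor" is the identity; truth = prediction + noise.
rng = np.random.default_rng(0)
cal_preds = rng.normal(size=500)
cal_truth = cal_preds + rng.normal(scale=0.3, size=500)
test_preds = np.array([0.0, 1.0])
lo, hi = split_conformal_intervals(cal_preds, cal_truth, test_preds)
```

GeoXCP applies this calibration idea to feature attribution scores rather than raw predictions, with the spatial adaptation described in the abstract.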

Metadata
Source: International Journal of Geographical Information Science
Type: Paper
Extraction Status: raw