Complex physical systems, from supersonic turbulence to the large-scale structure of the universe, are governed by continuous multiscale dynamics. While modern machine learning architectures excel at mapping the high-dimensional observables of these systems, it remains unclear whether they internalize the governing physical laws or merely interpolate between discrete statistical correlations. Standard explainable AI (XAI) approaches, particularly perturbation-based and gradient-saliency methods, rely on pixel-wise perturbations, which generate unphysical artifacts and push inputs off the valid empirical distribution. To resolve this, we introduce a diagnostic framework driven by Constrained Diffusion Decomposition (CDD), a diffusion-based multiscale data decomposition algorithm that enables physically constrained data generation and model evaluation via scale-aware modifications. Applying this framework to a Denoising Diffusion Probabilistic Model (DDPM), we execute deterministic interventions directly within the continuous, CDD-based scale space. We demonstrate that under moderate physical perturbations, the unconstrained generative model exhibits localized structural freezing and non-linear instability rather than continuous, PDE-like responses. The network fails to maintain cross-scale continuity, causing the generative trajectory to diverge when pushed into unseen physical states. By synthesizing a continuum of physically coherent states, this scale-aware methodology establishes a controlled testbed for evaluating algorithmic vulnerabilities and provides the rigorous physical constraints needed for future architectures to respect the multiscale causality of the natural universe.
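To make the notion of a scale-aware intervention concrete, the sketch below decomposes a 2D field into scale bands, amplifies a single band, and re-synthesizes the input before it would be handed to the generative model. This is a minimal illustration only: the Gaussian band-pass decomposition, the function names (`scale_decompose`, `scale_aware_perturbation`), and all parameters are hypothetical stand-ins for the paper's CDD algorithm, which is diffusion-based rather than filter-based.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def scale_decompose(field, sigmas=(1.0, 2.0, 4.0, 8.0)):
    """Split a 2D field into band-pass components plus a coarse residual.

    Gaussian band-pass filtering stands in here for the diffusion-based
    CDD decomposition; only the interface (a sum of scale components that
    reconstructs the field exactly) is analogous.
    """
    bands, previous = [], field
    for sigma in sigmas:
        smoothed = gaussian_filter(field, sigma)
        bands.append(previous - smoothed)   # detail retained between successive scales
        previous = smoothed
    bands.append(previous)                  # coarsest residual
    return bands

def scale_aware_perturbation(field, band_index, amplitude=1.1):
    """Amplify one scale band and re-synthesize the field, leaving all
    other scales untouched (a deterministic, physically localized intervention)."""
    bands = scale_decompose(field)
    bands[band_index] = amplitude * bands[band_index]
    return sum(bands)

# Toy usage: perturb an intermediate scale of a random "density" field.
rng = np.random.default_rng(0)
density = rng.standard_normal((128, 128))
perturbed = scale_aware_perturbation(density, band_index=2, amplitude=1.2)
# `perturbed` would then be fed to the trained DDPM to probe whether its
# response varies continuously (PDE-like) or exhibits freezing/instability.
```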