Fusion Complexity Inversion: Why Simpler Cross-View Modules Outperform SSMs and Cross-View Attention Transformers for Pasture Biomass Regression
This study demonstrates that, for pasture biomass regression on scarce agricultural data, prioritizing high-quality backbone pretraining combined with simple local fusion modules significantly outperforms complex global architectures such as state-space models (SSMs) and cross-view attention transformers, a phenomenon we term "fusion complexity inversion."
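As a rough illustration of what a "simple local fusion module" can mean in practice, the sketch below concatenates two views' backbone feature maps along the channel axis and applies a pointwise (1x1-convolution-style) projection, followed by pooling and a linear regression head. All shapes, weights, and names here are hypothetical assumptions for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

H, W, C = 8, 8, 16                          # spatial grid and per-view channels
view_a = rng.standard_normal((H, W, C))     # features from view A's backbone
view_b = rng.standard_normal((H, W, C))     # features from view B's backbone

# Local fusion: concatenate channels, then project 2C -> C with a learned
# pointwise weight matrix (equivalent to a 1x1 convolution). No cross-view
# attention or global sequence modeling is involved.
W_proj = rng.standard_normal((2 * C, C)) / np.sqrt(2 * C)
fused = np.concatenate([view_a, view_b], axis=-1) @ W_proj   # (H, W, C)

# A scalar biomass estimate would then come from pooling plus a linear head.
pooled = fused.mean(axis=(0, 1))                             # (C,)
w_head = rng.standard_normal(C) / np.sqrt(C)
biomass_pred = float(pooled @ w_head)

print(fused.shape, pooled.shape)
```

The point of the sketch is the parameter count: the fusion step is a single 2C-by-C matrix applied per spatial location, which is far cheaper than a cross-view attention block and leaves most of the model's capacity in the pretrained backbones.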