Fusions of One-Variable First-Order Modal Logics

This paper investigates the preservation of Kripke completeness and decidability in the independent fusion of one-variable first-order modal logics, demonstrating that these properties hold without equality under both expanding and constant domain semantics but fail when equality and non-rigid constants are introduced due to the encoding of Diophantine equations, while also establishing that the finite model property is preserved only in the local case.

Roman Kontchakov, Dmitry Shkatov, Frank Wolter · 2026-03-06 · 💻 cs

The Complexity of the Constructive Master Modality

This paper introduces the constructive master-modality logics CK* and WK*, proving their EXPTIME-completeness and exponential-size finite model property via translations to PDL, while resolving a conjecture regarding their diamond-free fragment and demonstrating the EXPTIME membership of CS4 and WS4 through embeddings.

Sofía Santiago-Fernández, David Fernández-Duque, Joost J. Joosten · 2026-03-06 · 🔢 math

Ohana trees, linear approximation and multi-types for the λI-calculus: No variable gets left behind or forgotten!

This paper introduces a novel equational theory for the λI-calculus based on "Ohana trees" that track hidden or infinite variables, establishes a commutation theorem between these trees and Taylor expansions, and provides a corresponding non-idempotent relational denotational model to capture this refined notion of equality.

Rémy Cerda, Giulio Manzonetto, Alexis Saurin · 2026-03-05 · 💻 cs

Continuous Modal Logical Neural Networks: Modal Reasoning via Stochastic Accessibility

This paper introduces Continuous Modal Logical Neural Networks (CMLNNs), a framework called Fluid Logic that lifts modal reasoning from discrete structures to continuous manifolds using Neural Stochastic Differential Equations to embed logical constraints directly into neural network training, thereby enabling structurally consistent solutions for epistemic, temporal, and deontic reasoning tasks without requiring explicit governing equations.

Antonin Sulc · 2026-03-05 · 🤖 cs.LG

The Geometry of Reasoning: Flowing Logics in Representation Space

This paper proposes a geometric framework that models LLM reasoning as smooth flows in representation space, presenting empirical evidence that next-token prediction leads models to internalize logical invariants as higher-order geometry, thereby challenging the "stochastic parrot" hypothesis and suggesting a universal representational law underlying machine understanding.

Yufa Zhou, Yixiao Wang, Xunjian Yin + 2 more · 2026-03-05 · 🤖 cs.AI