Keywords: Temporal Rule Learning
Abstract: We propose a novel differentiable neural architecture for learning first-order temporal logic rules enriched with metric operators. Building on differentiable immediate consequence operators over data, we extend the approach to temporal data by learning both the predicates and the temporal intervals in which they hold. Among the strengths of our model are its support for existential literals in rule bodies, which express eventualities within an interval, and its seamless applicability to data over both discrete and dense time intervals. Notably, our model captures temporal dependencies without reifying all possible timestamps, and it produces a number of rules linear in the size of the training set, which keeps model complexity manageable and aids scalability. We explore different use cases and show in experiments the benefits of our approach, highlighting its potential as a scalable solution for learning interpretable metric temporal rules from data.
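The core idea of learning the interval in which a predicate holds, rather than reifying every timestamp, can be illustrated with a minimal sketch. This is not the paper's architecture: the soft-interval membership function, the sharpness constant `k`, and the numeric-gradient training loop below are all illustrative assumptions, showing only how interval endpoints can become differentiable parameters fitted by gradient descent.

```python
import math

def sigmoid(x):
    # Numerically stable logistic function.
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)
    return z / (1.0 + z)

def soft_in_interval(t, a, b, k=10.0):
    # Soft membership of time t in the interval [a, b]:
    # ~1 when a < t < b, ~0 outside, and differentiable in a and b.
    # k is an illustrative sharpness constant, not from the paper.
    return sigmoid(k * (t - a)) * sigmoid(k * (b - t))

# Toy data: a predicate holds exactly on the interval [2, 5].
data = [(float(t), 1.0 if 2.0 <= t <= 5.0 else 0.0) for t in range(9)]

def loss(a_, b_):
    # Squared error between soft membership and observations.
    return sum((soft_in_interval(t, a_, b_) - y) ** 2 for t, y in data)

# Fit the endpoints by gradient descent; numeric gradients keep
# the sketch dependency-free (a real model would use autodiff).
a, b, lr, eps = 0.0, 8.0, 0.1, 1e-4
for _ in range(1000):
    ga = (loss(a + eps, b) - loss(a - eps, b)) / (2 * eps)
    gb = (loss(a, b + eps) - loss(a, b - eps)) / (2 * eps)
    a, b = a - lr * ga, b - lr * gb
```

After training, the learned endpoints bracket the interval on which the predicate holds, so only two continuous parameters (rather than one variable per timestamp) represent the temporal extent, and the same membership function applies unchanged over dense time.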
Primary Area: neurosymbolic & hybrid AI systems (physics-informed, logic & formal reasoning, etc.)
Submission Number: 24723