Papers
Equivalence and Theory Expansion | Synthese, 2025
This paper presents a novel framework for understanding theoretical equivalence that reconciles two familiar approaches to the problem: formal and content-based approaches. Formal approaches appeal to the logical and syntactic features of theories, while content-based approaches focus on their content as construed by various metaphysical approaches to semantics. I argue that the two are complementary and that a deeper view of equivalence emerges when we consider a theory’s expansion potential—its capacity to be embedded in broader theoretical contexts. This notion links content to syntax: syntactic structure constrains how a theory’s expressions can be used, and this use in turn determines the theory’s possibilities for representing the world across different theoretical contexts. The framework is applied to a range of cases where formal and content-based approaches face difficulties, showing that it can explain the equivalence and inequivalence of theories where previous approaches fail, and that it better justifies the strategies employed by proponents of these approaches in each case.
Use, Content, and Interpretation | Work in Progress
This paper develops a framework that bridges the gap between two prominent kinds of approach to theoretical equivalence, interpretation, and semantics: content-based and formal. Content-based approaches typically conceive of theory interpretation as a process of mapping sentences onto pre-formed semantic objects. This view faces significant objections, notably Dewar’s (2023) justified observation that interpretation is often implicitly treated as a simplistic matter of “sticking” pre-existing semantic contents onto expressions—an analogy I call “sticker-book semantics.” Conversely, purely formal approaches, which rely exclusively on syntactic or model-theoretic criteria, struggle with cases of formal underdetermination: scenarios in which formally identical theories represent distinct worldly facts. In response, I propose the “filtering lens” framework, a metasemantic account that puts community use at the forefront as a constraint on interpretation. Although the central insight—that linguistic and representational use shapes meaning—is broadly recognized, systematically formalizing it sheds light on issues concerning theoretical equivalence and interpretation. In particular, I illustrate how the framework refines formal criteria such as definitional and Morita equivalence with use-based constraints, yields accounts of theory expansion and revision, and supports alternative accounts of quantification in first- and second-order logic.
Subsystems as Aspects | Work in Progress
This paper bridges two seemingly unrelated ideas: (1) a metaontological framework for talking about “portions of reality” without committing to a specific object-property analysis of these “portions”, and (2) David Wallace’s subsystem-recursive view of physical theories. Metaontologists debate whether mereological disputes over composite objects are merely verbal (Hirsch, 2005) or reflect genuine disagreements about how the world is (Sider, 2009). Both sides often frame this debate as being about “whether there is an objectively correct way to carve portions of reality into objects and properties”, but they treat this “portion” talk as merely metaphorical. I show how it can be made precise by interpreting portions of reality as aspects of facts, as understood in Rayo (2025). The resulting formalism is neutral with respect to any particular object-property analysis of these “portions”. Unexpectedly, the same formalism also yields an interpretation of Wallace’s (2022a) subsystem structures: each physical subsystem in a subsystem structure is identified with a corresponding aspect. This supports the thesis that the physical treatment of subsystems can remain neutral in exactly the same way, lending fresh support to structural realist positions defended in the philosophy of physics (Ladyman and Ross, 2007; Wallace, 2022b).
Logical Consequence and Explicitness: A Case Study | Work in Progress
In his article “Higher-Order Logic Reconsidered,” Jané argues that second-order canonical consequence is inadequate for axiomatizing set theory because it fails to meet a requirement he calls non-interference. Although Jané does not explicitly define this criterion, he provides examples in which it is clearly violated. These give us a rough idea of what the requirement might be and of why it is demanded of the background logic of an axiomatic theory. Nevertheless, the inadequacies exhibited in Jané’s examples differ significantly from the case of second-order consequence. In this essay I will consider an alternative diagnosis of what goes wrong in these examples and argue that second-order consequence steers clear of the specific problems they exhibit.
Conference Presentations
Isn’t the Physics of Spacetime Metaphysics Enough? | SMS 2025 • USI, Lugano
Recent debates about substantivalism in General Relativity (GR) have unfolded along two partially disjoint paths. On one hand, metaphysical approaches ask whether spacetime is truly independent of matter in modal or grounding terms, often dismissing the formal apparatus of GR as a mere tool for representing possibilities. On the other hand, formal approaches rely on category theory and related mathematics to argue for or “block” conclusions about spacetime’s metaphysics without, ostensibly, invoking additional metaphysical assumptions. This talk aims to bridge these perspectives by urging that GR be seen as an interpreted theory. Seen this way, GR already prescribes a way to describe which possibilities it allows: once one fixes a model and an interpretive context, many “metaphysical” questions are effectively answered within GR. At the same time, purely formal results—like dualities between manifold-based GR and Einstein Algebras—do not, on their own, establish physical equivalence or “same facts” claims without further interpretive or metaphysical premises. Thus, both views err if they disregard GR’s built-in interpretive content: metaphysicians sometimes seek external descriptions that replicate what GR already specifies, while formalists presume to “block” disputes by mathematics alone, overlooking that category-theoretic claims become substantive only through interpretation.
Equivalence, Use and Content | 2024 APA Pacific Division Meeting
I develop a framework for thinking about the problem of theoretical equivalence that takes into account how the use of a theory for representational purposes constrains its possible interpretations. I then show how this framework allows us to justify and evaluate equivalence claims by appealing to claims about use.
Equivalence and Theory Expansion | 2024 APA Central Division Meeting
I frame the issue of theoretical equivalence as one involving the equivalence of theories formulated in interpreted languages. I then develop an account of how theories interact when they are jointly adopted, and, based on this account, a criterion for evaluating whether two theories are equivalent. The upshot is that commonly used criteria for equivalence, which are based on inter-translatability and model isomorphism, may not be sufficient for equivalence.
Using ChatGPT to Facilitate Assignment Feedback and Evaluation | 2024 AAPT-APA Teaching Hub
(Co-authored with Bram Vaassen)
A persistent challenge in teaching is providing timely and constructive feedback on student assignments—especially under time constraints. In this interactive session, we presented two complementary approaches to integrating ChatGPT into grading and feedback workflows. First, we showcased a Python-based tool for large-scale grading of undergraduate philosophy essays using GPT-4 and rubric-based evaluation, highlighting its potential and its limitations—particularly the weak correlation with human-assigned grades. Second, we discussed a more precise, rubric-anchored method developed by Satish Strömberg, which achieved strong alignment with teacher grades in smaller, coarser-grained contexts. We also explored ways ChatGPT can support early-stage writing feedback (e.g., identifying grammar issues or converting notes into paragraphs), while noting the epistemic challenges students face in filtering its suggestions. The session concluded with a hands-on workshop segment and an open discussion of ethical questions concerning AI’s role in education—especially regarding teacher authority, bias, and the division of cognitive labor between instructors and LLMs.
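To make the rubric-based workflow concrete, here is a minimal sketch of the kind of pipeline the session discussed, assuming the standard openai-python chat-completions interface; the rubric criteria, prompt wording, and function names are illustrative assumptions, not the tool we presented.

```python
# Minimal illustrative sketch of rubric-based essay grading with GPT-4.
# The rubric, prompt, and names are hypothetical; only the openai-python
# (>= 1.0) chat-completions call reflects a real, documented interface.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

RUBRIC = """\
Thesis clarity (0-5): states a clear, arguable thesis.
Argument quality (0-5): premises are explicit and support the conclusion.
Engagement with objections (0-5): raises and answers at least one objection.
Writing (0-5): prose is clear and well organized.
"""

def grade_essay(essay_text: str) -> str:
    """Score one essay against the rubric and justify each score."""
    response = client.chat.completions.create(
        model="gpt-4",
        temperature=0,  # keep grading as repeatable as the API allows
        messages=[
            {"role": "system",
             "content": ("You are a teaching assistant grading undergraduate "
                         "philosophy essays strictly against the given rubric.")},
            {"role": "user",
             "content": (f"Rubric:\n{RUBRIC}\nEssay:\n{essay_text}\n\n"
                         "For each criterion, return one line of the form "
                         "'criterion: score/5 - one-sentence justification', "
                         "then a total out of 20.")},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(grade_essay("Paste or load an essay here..."))
```

As the weak correlation with human-assigned grades noted above suggests, output from a sketch like this is best treated as draft feedback for instructor review rather than as a final grade.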
Logical Consequence and Explicitness: A Case Study | 2023 APA Pacific Division Meeting
In his article “Higher-Order Logic Reconsidered,” Jané argues that second-order canonical consequence is inadequate for axiomatizing set theory because it fails to meet a requirement he calls non-interference. Although Jané does not explicitly define this criterion, he provides examples in which it is clearly violated. These give us a rough idea of what the requirement might be and of why it is demanded of the background logic of an axiomatic theory. Nevertheless, the inadequacies exhibited in Jané’s examples differ significantly from the case of second-order consequence. In this essay I will consider an alternative diagnosis of what goes wrong in these examples and argue that second-order consequence steers clear of the specific problems they exhibit.
Mathematical Competence and Rule-Following | VII International Wittgenstein Congress • PUCP, 2019
(with Eduardo Villanueva)
What must a rational agent grasp in order to competently use the arithmetic sign ‘+’? A natural answer is: the rule governing its application. But as Wittgenstein emphasized in the Philosophical Investigations and the Remarks on the Foundations of Mathematics, no finite pattern of usage—nor any explanation—can uniquely determine a rule, since any course of action can be made to accord with them. This is the heart of the rule-following problem. This talk examines how different ways of characterizing mathematical competence respond to this challenge. Drawing on David Marr’s distinction between the functional, algorithmic, and implementational levels of analysis, we suggest that each level supports a different notion of what it is to “follow a rule.” But rather than solving Wittgenstein’s problem, these levels reveal different ways in which the problem arises.
Trivialist Epistemology and False Mathematical Beliefs | Rutgers-Columbia Undergraduate Conference • Rutgers–New Brunswick, 2019
In The Construction of Logical Space, Agustín Rayo develops an interesting stance in the philosophy of mathematics called Trivialist Platonism. According to this view, mathematical truths are trivial: they require nothing of the world in order to be true. Thus, pure mathematical sentences carry no information about the world. This leaves the Trivialist with the task of explaining the following apparently obvious facts: that we often lack certain mathematical knowledge, that we learn mathematics, and that we can be wrong about it. Rayo attempts to explain these phenomena using a metalinguistic strategy coupled with a fragmentarist framework. As such, the strategy is subject to the standard criticisms of metalinguistic explanations of mathematical knowledge. In this article I will briefly review this account of mathematical knowledge and then discuss a modification to the fragmentarist framework designed to account for necessarily false beliefs. Afterwards, I will characterize a group of dispositions, which I call inferential dispositions, argue that they play a central role in explaining the language-independent character of mathematical beliefs, and show why accepting the proposed modification is crucial if the Trivialist wants to account for these dispositions.
Mysticism in Late Wittgenstein’s Philosophy | XIV Philosophy Student Symposium • PUCP, 2018
This talk explores the role of the mystical in Wittgenstein’s Philosophical Investigations and its relation to the earlier saying/showing distinction from the Tractatus Logico-Philosophicus. While the mystical is central to the early work, its place in Wittgenstein’s later thought is more obscure. I examine whether the later rejection of the picture theory of meaning entails a rejection of the saying/showing distinction, or whether a reinterpreted version survives within Wittgenstein’s account of language as rule-governed practice. Drawing on remarks by Wittgenstein and scholars like Fann, I assess whether the mystical retains a role in delimiting the bounds of sense in the later philosophy, and how this connects with Wittgenstein’s avowedly religious perspective on philosophical problems.
Postulational Possibility and Metaphysical Realism | XIII Philosophy Student Symposium • PUCP, 2017
This talk examines how metaphysical realism is challenged by Kit Fine’s account of quantification in Relatively Unrestricted Quantification, based on the idea of indefinite extensibility and the notion of postulational possibility. Realism is typically formulated as the claim that there is a definite answer to what exists—often grounded in the existence of a universal domain. But Fine’s expansionist view suggests that any domain of quantification can be extended, undermining the realist’s ability to articulate their thesis in familiar terms. We explore how this affects the linguistic resources available to the realist and assess whether a version of realism can still be coherently formulated if Fine’s argument is accepted.
Inconsistent Beliefs, Inference and Fragmentation | IV Analytic Philosophy Student Meeting • San Marcos National University, 2015
This talk explores the capabilities and limitations of Rayo’s formal account of fragmentation in The Construction of Logical Space as a model of the cognitive states of agents with inconsistent beliefs.