Pith · machine review for the scientific record

arxiv: 2604.05246 · v1 · submitted 2026-04-06 · 💻 cs.PL

Recognition: 2 theorem links · Lean Theorem

A Gradual Probabilistic Lambda Calculus

Authors on Pith: no claims yet

Pith reviewed 2026-05-10 18:40 UTC · model grok-4.3

classification: 💻 cs.PL
keywords: gradual typing · probabilistic programming · lambda calculus · type safety · gradual guarantee · probabilistic couplings · conservative extension

The pith

GPLC lets programmers gradually add or remove type and probability annotations in probabilistic code while preserving type safety.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper defines GPLC as a gradual probabilistic lambda calculus that includes a binary choice operator and supports optional static annotations on both types and probabilities. It gives the language a static semantics that uses probabilistic couplings to relate imprecise types, and a dynamic semantics obtained by elaboration to TPLC, whose programs denote probability distributions. The metatheory shows that the resulting system is type safe, that it is a conservative extension of a fully static probabilistic calculus, and that it satisfies the gradual guarantee, behaving smoothly as type and probability annotations are made more or less precise.
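The distribution-based flavor of the dynamic semantics can be illustrated with a minimal sketch: a binary probabilistic choice combines the distributions of its two branches by scaling the left by p and the right by 1 − p and summing. This is an editorial illustration of the general idea only, not the paper's TPLC machinery (which also tracks evidence and typed errors); all names here are invented for the sketch.

```python
from fractions import Fraction
from collections import defaultdict

def dirac(v):
    """Point distribution: the program's final value is v with probability 1."""
    return {v: Fraction(1)}

def choice(p, d1, d2):
    """Binary probabilistic choice under a distribution semantics:
    scale the left branch's distribution by p, the right's by 1 - p, then add."""
    out = defaultdict(Fraction)
    for v, q in d1.items():
        out[v] += p * q
    for v, q in d2.items():
        out[v] += (1 - p) * q
    return dict(out)

# (true [+]_{1/2} false) [+]_{1/2} true denotes a distribution over
# final values: {True: 3/4, False: 1/4}.
inner = choice(Fraction(1, 2), dirac(True), dirac(False))
outer = choice(Fraction(1, 2), inner, dirac(True))
```

Exact rationals keep the sanity check trivial: the resulting masses always sum to 1.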

Core claim

GPLC is a gradual source probabilistic lambda calculus whose static semantics relies on probabilistic couplings to define consistency, precision, and consistent transitivity, while its dynamic semantics is given by elaboration to TPLC; TPLC interprets programs as distributions over final values, and both languages are shown to be type safe, to conservatively extend their fully static variants, and to satisfy the gradual guarantee with respect to type precision.

What carries the argument

Probabilistic couplings that define consistency, precision, and consistent transitivity between gradual types in the static semantics of GPLC.
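For finite distributions, the existence of a coupling whose support lies inside a given relation has a classical characterization (Strassen's condition), which makes the load-bearing device concrete. The sketch below is an editorial illustration: the relation `consistent` and the treatment of "?" are invented stand-ins, not the paper's definitions of consistency over distribution types.

```python
from fractions import Fraction
from itertools import combinations

def coupling_exists(mu1, mu2, related):
    """Strassen's condition for finite distributions: a coupling of mu1 and
    mu2 supported on `related` exists iff, for every subset S of mu1's
    support, mu1(S) <= mu2({y : some x in S is related to y})."""
    xs, ys = list(mu1), list(mu2)
    for r in range(1, len(xs) + 1):
        for S in combinations(xs, r):
            image = [y for y in ys if any(related(x, y) for x in S)]
            if sum(mu1[x] for x in S) > sum(mu2[y] for y in image):
                return False
    return True

# Toy consistency check: treat "?" as consistent with any simple type.
consistent = lambda a, b: a == b or a == "?" or b == "?"
mu_static = {"Bool": Fraction(1, 2), "Real": Fraction(1, 2)}
mu_gradual = {"Bool": Fraction(1, 2), "?": Fraction(1, 2)}
```

Here a coupling relating `mu_static` to `mu_gradual` exists (the "Real" mass can be matched against the "?" slot), whereas no equality-supported coupling relates `mu_static` to a point mass on "Bool".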

If this is right

  • Any GPLC program can be made fully static or fully dynamic by adjusting annotations, and the result remains type safe.
  • Increasing the precision of any type or probability annotation produces a program whose distribution is related to the original by the gradual guarantee.
  • TPLC programs obtained by elaboration are always well-typed and either diverge or evaluate to a probability distribution over values.
  • The language supports a spectrum of checking styles from fully static to fully dynamic without separate implementations.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The coupling technique used for gradual relations might transfer to other calculi that track distributions or expectations.
  • A practical implementation could allow developers to start with dynamic probabilistic scripts and incrementally add type information while re-checking safety.
  • The gradual guarantee implies that small annotation changes cannot suddenly alter the set of possible runtime errors or the support of the output distribution.

Load-bearing premise

The static relations between gradual types can be defined and shown to be transitive using probabilistic couplings between the distributions they denote.
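The classical reason coupling-based relations compose at all is the gluing construction: two couplings that share a middle marginal combine into a coupling of the outer distributions by making the ends conditionally independent given the middle value. The sketch below shows only this standard construction; the paper's consistent transitivity imposes further constraints arising from imprecision, which are not modeled here.

```python
from fractions import Fraction
from collections import defaultdict

def compose(c12, c23, mu2):
    """Gluing of couplings: given a coupling c12 of (mu1, mu2) and a coupling
    c23 of (mu2, mu3), both with middle marginal mu2, build a coupling of
    (mu1, mu3) with mass(x, z) = sum_y c12[x, y] * c23[y, z] / mu2[y]."""
    out = defaultdict(Fraction)
    for (x, y), p in c12.items():
        for (y2, z), q in c23.items():
            if y == y2 and mu2[y] != 0:
                out[(x, z)] += p * q / mu2[y]
    return dict(out)

mu2 = {"a": Fraction(1, 2), "b": Fraction(1, 2)}
c12 = {("A", "a"): Fraction(1, 2), ("A", "b"): Fraction(1, 2)}
c23 = {("a", "Z"): Fraction(1, 2), ("b", "Z"): Fraction(1, 2)}
c13 = compose(c12, c23, mu2)   # all mass lands on ("A", "Z")
```

The composed table is again a genuine coupling: its marginals are the original outer distributions, and its total mass is 1.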

What would settle it

A concrete GPLC program and a pair of type annotations differing only in precision such that the elaborated TPLC programs produce observably different output distributions.

Figures

Figures reproduced from arXiv: 2604.05246 by Federico Olmedo, Matías Toro, Wenjia Ye.

Figure 1: Syntax of SPLC.
Figure 2: Type system of SPLC.
Figure 3: Runtime semantics of SPLC.
Figure 4: Syntax of GPLC.
Figure 5: Type system of GPLC.
Figure 6: Probability splitting to justify consistency between gradual distribution types.
Figure 7: Formula types.
Figure 8: Type precision in GPLC.
Figure 9: Term precision in GPLC.
Figure 10: Syntax of TPLC (excerpt).
Figure 11: Type system of TPLC.
Figure 12: Distribution semantics of TPLC (part 1).
Figure 13: Distribution semantics of TPLC (part 2).
Figure 14: Illustration of types and evidence for reduction example.
Figure 15: Elaboration from GPLC to TPLC.
Figure 16: Term precision of TPLC.
Figure 17: Logical relation between SPLC and TPLC.
Figure 18: SPLC (static language: complete rules).
Figure 19: SPLC: distribution semantics.
Figure 20: Logical relation between SPLC and TPLC.
Figure 21: Type system of GPLC.
Figure 22: Precision of GPLC.
Figure 23: TPLC: type system.
Figure 24: TPLC: distribution semantics.
Figure 25: Elaboration from GPLC to TPLC.
Figure 26: Precision of TPLC.
Original abstract

Probabilistic programming languages have recently gained a lot of attention, in particular due to their applications in domains such as machine learning and differential privacy. To establish invariants of interest, many such languages include some form of static checking in the form of type systems. However, adopting such a type discipline can be cumbersome or overly conservative. Gradual typing addresses this problem by supporting a smooth transition between static and dynamic checking, and has been successfully applied for languages with different constructs and type abstractions. Nevertheless, its benefits have never been explored in the context of probabilistic languages. In this work, we present and formalize GPLC, a gradual source probabilistic lambda calculus. GPLC includes a binary probabilistic choice operator and allows programmers to gradually introduce/remove static type -- and probability -- annotations. The static semantics of GPLC heavily relies on the notion of probabilistic couplings, as required for defining several relations, such as consistency, precision, and consistent transitivity. The dynamic semantics of GPLC is given via elaboration to the target language TPLC, which features a distribution-based semantics interpreting programs as probability distributions over final values. Regarding the language metatheory, we establish that TPLC -- and therefore also GPLC -- is type safe and satisfies two of the so-called refined criteria for gradual languages, namely, that it is a conservative extension of a fully static variant and that it satisfies the gradual guarantee, behaving smoothly with respect to type precision.

Editorial analysis

A structured set of objections, weighed in public.

Referee report, simulated author's rebuttal, a circularity audit, and an axiom ledger. Tearing a paper down is the easy half of reading it; the pith above is the substance, and this is the friction.

Referee Report

1 major / 3 minor

Summary. The manuscript presents GPLC, a gradual source probabilistic lambda calculus featuring a binary probabilistic choice operator and support for gradually introducing or removing static type and probability annotations. Its static semantics relies on probabilistic couplings to define relations including consistency, precision, and consistent transitivity. GPLC elaborates to TPLC, whose dynamic semantics interprets programs as probability distributions over values. The metatheory claims type safety for TPLC (hence GPLC via elaboration), that GPLC is a conservative extension of a fully static variant, and that it satisfies the gradual guarantee with respect to type precision.

Significance. If the claims hold, this is a significant contribution as the first formalization of gradual typing for probabilistic languages, an area relevant to machine learning and differential privacy where static checking can be cumbersome. The use of couplings to handle probabilistic relations in the gradual criteria is a technically sound device that enables the conservative extension and gradual guarantee results. The paper provides explicit definitions and stated proofs of type safety, conservative extension, and the gradual guarantee, which strengthens the assessment.

major comments (1)
  1. §4 (Metatheory, proofs of gradual guarantee): the claim that consistent transitivity holds via couplings needs an explicit statement of the conditions under which the coupling-based relation is transitive for arbitrary probability distributions; without this, the gradual guarantee may not apply to all well-typed programs.
minor comments (3)
  1. Abstract: the reference to 'refined criteria for gradual languages' should include a citation to the originating work (e.g., Siek et al. or follow-up papers on refined gradual typing criteria).
  2. Syntax and static semantics sections: the notation distinguishing probability annotations from type annotations in the gradual rules should be made more explicit, perhaps via a dedicated figure or table of examples.
  3. Related work: add a brief comparison to existing gradual calculi with effects or non-determinism to clarify the novelty of the coupling approach.

Simulated Author's Rebuttal

1 responses · 0 unresolved

We thank the referee for their positive evaluation of the manuscript and for the constructive comment on the metatheory section. We address the point below and will incorporate a clarification in the revised version.

Point-by-point responses
  1. Referee: §4 (Metatheory, proofs of gradual guarantee): the claim that consistent transitivity holds via couplings needs an explicit statement of the conditions under which the coupling-based relation is transitive for arbitrary probability distributions; without this, the gradual guarantee may not apply to all well-typed programs.

    Authors: We appreciate the referee's suggestion for strengthening the presentation. The manuscript defines the coupling-based consistency relation (Definition 4.3) and proves consistent transitivity (Lemma 4.8) as a step toward the gradual guarantee (Theorem 4.13). The transitivity proof relies on the existence of suitable couplings that preserve the marginal distributions and the precision ordering on types and probabilities. We agree that stating the precise conditions more explicitly (namely, that the relation is transitive whenever there exist couplings whose supports satisfy the consistency and precision constraints for the underlying distributions) would improve readability and make the scope of the result clearer. In the revised manuscript we will add a short clarifying lemma immediately before Theorem 4.13 that enumerates these conditions. This is a presentational improvement only; the existing proofs already establish the result for all well-typed programs in GPLC. Revision: yes.

Circularity Check

0 steps flagged

No significant circularity in derivation chain

full rationale

The paper defines GPLC via elaboration to TPLC and proves type safety, conservative extension, and the gradual guarantee using new relations (consistency, precision, consistent transitivity) built on the external notion of probabilistic couplings. These relations and the subsequent metatheory proofs do not reduce to self-definitions, fitted parameters renamed as predictions, or load-bearing self-citations; the central claims rest on standard gradual-typing criteria and independently verifiable coupling properties rather than circular reductions to the paper's own inputs.

Axiom & Free-Parameter Ledger

0 free parameters · 2 axioms · 2 invented entities

The central claim rests on standard mathematical notions of probability distributions and couplings plus gradual typing concepts; no numeric free parameters are introduced, and the new language constructs are the primary addition.

axioms (2)
  • domain assumption Probabilistic couplings can define consistency, precision, and consistent transitivity relations for gradual typing over probabilistic choice.
    Explicitly stated as the basis for static semantics in the abstract.
  • domain assumption Elaboration to a distribution-based target language preserves type safety and gradual properties.
    Core of the dynamic semantics and metatheory claims.
invented entities (2)
  • GPLC no independent evidence
    purpose: Gradual source probabilistic lambda calculus with binary choice and mixed annotations
    New language defined in the paper; no independent evidence outside this work.
  • TPLC no independent evidence
    purpose: Target language with distribution-based semantics for elaboration
    Introduced as the elaboration target; no independent evidence.

pith-pipeline@v0.9.0 · 5550 in / 1400 out tokens · 42554 ms · 2026-05-10T18:40:24.009544+00:00 · methodology

discussion (0)


Lean theorems connected to this paper

Citations machine-checked in the Pith Canon. Every link opens the source theorem in the public Lean library.

What do these tags mean?
matches
The paper's claim is directly supported by a theorem in the formal canon.
supports
The theorem supports part of the paper's argument, but the paper may add assumptions or extra steps.
extends
The paper goes beyond the formal theorem; the theorem is a base layer rather than the whole result.
uses
The paper appears to rely on the theorem as machinery.
contradicts
The paper's claim conflicts with a theorem or certificate in the canon.
unclear
Pith found a possible connection, but the passage is too broad, indirect, or ambiguous to say the theorem truly supports the claim.

Reference graph

Works this paper leans on

92 extracted references · 20 canonical work pages

  1. [1]

    Martin Avanzini, Ugo Dal Lago, and Alexis Ghyselen. 2021. Type-Based Complexity Analysis of Probabilistic Functional Programs. InProceedings of the 34th Annual ACM/IEEE Symposium on Logic in Computer Science (LICS ’19). IEEE Press, Article 41, 13 pages

  2. [2]

    Arthur Azevedo de Amorim, Matt Fredrikson, and Limin Jia. 2020. Reconciling Noninterference and Gradual Typing. InProceedings of the 2020 Symposium on Logic in Computer Science (LICS 2020)

  3. [3]

    Felipe Ba˜nados Schwerter, Ronald Garcia, and ´Eric Tanter. 2016. Gradual Type-and-Effect Systems.Journal of Functional Programming26 (Sept. 2016), 19:1–19:69

  4. [4]

    Gilles Barthe, Benjamin Gr´egoire, and Santiago Zanella-B´eguelin. 2009. Formal Certification of Code-Based Cryptographic Proofs. In36th ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages (POPL’09). ACM, New York, 90–101

  5. [5]

    12 Johannes Borgström, Ugo Dal Lago, Andrew D

    Johannes Borgstr¨om, Ugo Dal Lago, Andrew D. Gordon, and Marcin Szymczak. 2016. A lambda-calculus foundation for universal probabilistic programming. InProceedings of the 21st ACM SIGPLAN International Conference on Functional Programming, ICFP 2016, Nara, Japan, September 18-22, 2016, Jacques Garrigue, Gabriele Keller, and Eijiro Sumii (Eds.). ACM, 33–46...

  6. [6]

    Brett Boston, Adrian Sampson, Dan Grossman, and Luis Ceze. 2015. Probability type inference for flexible approximate programming. InProceedings of the 2015 ACM SIGPLAN International Conference on Object- Oriented Programming, Systems, Languages, and Applications, OOPSLA 2015, part of SPLASH 2015, Pittsburgh, PA, USA, October 25-30, 2015, Jonathan Aldrich ...

  7. [7]

    Michael Carbin, Sasa Misailovic, and Martin C. Rinard. 2016. Verifying quantitative reliability for programs that execute on unreliable hardware.Commun. ACM59, 8 (2016), 83–91. https://doi.org/10.1145/2958738

  8. [8]

    Giuseppe Castagna and Victor Lanvin. 2017. Gradual Typing with Union and Intersection Types.Proceedings of the ACM on Programming Languages1, ICFP (Sept. 2017), 41:1–41:28

  9. [9]

    Giuseppe Castagna, Victor Lanvin, Tommaso Petrucciani, and Jeremy G. Siek. 2019. Gradual typing: a new perspective. See[48], 16:1–16:32. A Gradual Probabilistic Lambda Calculus 41

  10. [10]

    Rajamani, Aditya V

    Guillaume Claret, Sriram K. Rajamani, Aditya V. Nori, Andrew D. Gordon, and Johannes Borgstr¨om. 2013. Bayesian Inference using Data Flow Analysis. InProceedings of the 9th Joint Meeting on Foundations of Software Engineering (ESEC/FSE 2013). ACM, 92–102

  11. [11]

    Vincent Danos and Thomas Ehrhard. 2011. Probabilistic coherence spaces as a model of higher-order probabilistic computation.Information and Computation209, 6 (2011), 966–991. https://doi.org/10.1016/j. ic.2011.02.001

  12. [12]

    Yuxin Deng and Wenjie Du. 2011. Logical, Metric, and Algorithmic Characterisations of Probabilistic Bisimulation.CoRRabs/1103.4577 (2011). arXiv:1103.4577 http://arxiv.org/abs/1103.4577

  13. [13]

    Tim Disney and Cormac Flanagan. 2011. Gradual information flow typing. InInternational Workshop on Scripts to Programs

  14. [14]

    Cynthia Dwork and Aaron Roth. 2014. The Algorithmic Foundations of Differential Privacy.Found. Trends Theor. Comput. Sci.9, 3-4 (2014), 211–407. https://doi.org/10.1561/0400000042

  15. [15]

    Thomas Ehrhard, Michele Pagani, and Christine Tasson. 2017. Measurable Cones and Stable, Measurable Functions: A Model for Probabilistic Higher-Order Programming.Proc. ACM Program. Lang.2, POPL, Article 59 (dec 2017), 28 pages. https://doi.org/10.1145/3158147

  16. [16]

    Luminous Fennell and Peter Thiemann. 2013. Gradual Security Typing with References. InProceedings of the 26th Computer Security Foundations Symposium (CSF). 224–239

  17. [17]

    Ronald Garcia. 2013. Calculating threesomes, with blame. InProceedings of the 18th ACM SIGPLAN International Conference on Functional programming. 417–428

  18. [18]

    Clark, and ´Eric Tanter

    Ronald Garcia, Alison M. Clark, and ´Eric Tanter. 2016. Abstracting Gradual Typing. InProceedings of the 43rd ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages (POPL 2016), Rastislav Bod´ık and Rupak Majumdar (Eds.). ACM Press, St Petersburg, FL, USA, 429–442. See erratum: https://www.cs.ubc.ca/ rxg/agt-erratum.pdf

  19. [19]

    Zoubin Ghahramani. 2015. Probabilistic Machine Learning and Artificial Intelligence.Nature521, 7553 (2015), 452–459

  20. [20]

    Shafi Goldwasser and Silvio Micali. 1984. Probabilistic Encryption.J. Comput. Sys. Sci.28, 2 (1984), 270–299

  21. [21]

    Goodman, Vikash K

    Noah D. Goodman, Vikash K. Mansinghka, Daniel Roy, Keith Bonawitz, and Joshua B. Tenenbaum. 2008. Church: A Language for Generative Models. InProceedings of the Twenty-Fourth Conference on Uncertainty in Artificial Intelligence (UAI’08). AUAI Press, 220–229

  22. [22]

    Noah D Goodman and Andreas Stuhlm¨ uller. 2014. The Design and Implementation of Probabilistic Programming Languages. http://dippl.org. Accessed: 2022-10-17

  23. [23]

    Gordon, Thomas A

    Andrew D. Gordon, Thomas A. Henzinger, Aditya V. Nori, and Sriram K. Rajamani. 2014. Probabilistic programming. InProceedings of the on Future of Software Engineering, FOSE 2014. ACM, 167–181

  24. [24]

    Momoko Hattori, Naoki Kobayashi, and Ryosuke Sato. 2023. Gradual Tensor Shape Checking. InProgramming Languages and Systems: 32nd European Symposium on Programming, ESOP 2023, Held as Part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2023, Paris, France, April 22–27, 2023, Proceedings. Springer-Verlag, Berlin, Heidelberg, 19...

  25. [25]

    Willem Heijltjes and Georgina Majury. 2025. Simple Types for Probabilistic Termination. In33rd EACSL Annual Conference on Computer Science Logic, CSL 2025, February 10-14, 2025, Amsterdam, Netherlands (LIPIcs), J ¨org Endrullis and Sylvain Schmitz (Eds.), Vol. 326. Schloss Dagstuhl - Leibniz-Zentrum f¨ ur Informatik, 31:1–31:21. https://doi.org/10.4230/LI...

  26. [26]

    David Herman, Aaron Tomb, and Cormac Flanagan. 2007. Space-efficient gradual typing. InIn Trends in Functional Programming (TFP

  27. [27]

    David Herman, Aaron Tomb, and Cormac Flanagan. 2010. Space-efficient gradual typing.Higher-Order and Sympolic Computation23, 2 (June 2010), 167–189

  28. [28]

    Chris Heunen, Ohad Kammar, Sam Staton, and Hongseok Yang. 2017. A convenient category for higher-order probability theory. In32nd Annual ACM/IEEE Symposium on Logic in Computer Science, LICS 2017, Reykjavik, Iceland, June 20-23, 2017. IEEE Computer Society, 1–12. https://doi.org/10.1109/LICS.2017.8005137

  29. [29]

    Xuejing Huang and Bruno C. d. S. Oliveira. 2020. A Type-Directed Operational Semantics For a Calculus with a Merge Operator. InECOOP

  30. [30]

    Jafery and Jana Dunfield

    Khurram A. Jafery and Jana Dunfield. 2017. Sums of Uncertainty: Refinements Go Gradual, See [ 47], 804–817. 42 Wenjia Ye, Mat ´ıas Toro, and Federico Olmedo

  31. [31]

    Jones and Gordon D

    C. Jones and Gordon D. Plotkin. 1989. A probabilistic powerdomain of evaluations.[1989] Proceedings. Fourth Annual Symposium on Logic in Computer Science(1989), 186–195

  32. [32]

    Oleg Kiselyov. 2016. Probabilistic Programming Language and its Incremental Evaluation. InProgramming Languages and Systems - 14th Asian Symposium, APLAS 2016, Hanoi, Vietnam, November 21-23, 2016, Proceedings (Lecture Notes in Computer Science), Atsushi Igarashi (Ed.), Vol. 10017. 357–376. https: //doi.org/10.1007/978-3-319-47958-3 19

  33. [33]

    Ugo Dal Lago and Charles Grellois. 2017. Probabilistic Termination by Monadic Affine Sized Typing.ACM Transactions on Programming Languages and Systems (TOPLAS)41 (2017), 1 – 65

  34. [34]

    Ugo Dal Lago and Margherita Zorzi. 2012. Probabilistic operational semantics for the lambda calculus.RAIRO Theor. Informatics Appl.46, 3 (2012), 413–450. https://doi.org/10.1051/ita/2012012

  35. [35]

    Ugo Dal Lago and Margherita Zorzi. 2012. Probabilistic operational semantics for the lambda calculus.ArXiv abs/1104.0195 (2012)

  36. [36]

    Tuan Anh Le, Atilim Gunes Baydin, and Frank D. Wood. 2017. Inference Compilation and Universal Probabilistic Programming. InProceedings of the 20th International Conference on Artificial Intelligence and Statistics, AISTATS 2017, 20-22 April 2017, Fort Lauderdale, FL, USA (Proceedings of Machine Learning Research), Aarti Singh and Xiaojin (Jerry) Zhu (Eds...

  37. [37]

    Nico Lehmann and ´Eric Tanter. 2017. Gradual Refinement Types, See [47], 775–788

  38. [38]

    Meven Lennon-Bertrand, Kenji Maillard, Nicolas Tabareau, and ´Eric Tanter. 2022. Gradualizing the Calculus of Inductive Constructions.ACM Transactions on Programming Languages and Systems(2022). To appear. To be presented at POPL ’22

  39. [39]

    Luke Ong, Hugo Paquet, and Dominik Wagner

    Carol Mak, C.-H. Luke Ong, Hugo Paquet, and Dominik Wagner. 2021. Densities of Almost Surely Terminating Probabilistic Programs are Differentiable Almost Everywhere. InProgramming Languages and Systems - 30th European Symposium on Programming, ESOP 2021, Held as Part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2021, Luxembo...

  40. [40]

    Stefan Malewski, Michael Greenberg, and ´Eric Tanter. 2021. Gradually Structured Data.Proceedings of the ACM on Programming Languages5, OOPSLA (Nov. 2021), 126:1–126:28

  41. [41]

    Zeina Migeed, James Reed, Jason Ansel, and Jens Palsberg. 2024. Generalizing Shape Analysis with Gradual Types. In38th European Conference on Object-Oriented Programming (ECOOP 2024) (Leibniz International Proceedings in Informatics (LIPIcs)), Jonathan Aldrich and Guido Salvaneschi (Eds.), Vol. 313. Schloss Dagstuhl – Leibniz-Zentrum f¨ ur Informatik, Dag...

  42. [42]

    Sparsh Mittal. 2016. A Survey of Techniques for Approximate Computing.ACM Comput. Surv.48, 4 (2016), 62:1–62:33. https://doi.org/10.1145/2893356

  43. [43]

    1995.Randomized Algorithms

    Rajeev Motwani and Prabhakar Raghavan. 1995.Randomized Algorithms. Cambridge University Press

  44. [44]

    New, Daniel R

    Max S. New, Daniel R. Licata, and Amal Ahmed. 2019. Gradual Type Theory. See[48], 15:1–15:31

  45. [45]

    Avi Pfeffer. 2010. Practical Probabilistic Programming. InInductive Logic Programming - 20th International Conference, ILP 2010, Florence, Italy, June 27-30, 2010. Revised Papers (Lecture Notes in Computer Science), Paolo Frasconi and Francesca A. Lisi (Eds.), Vol. 6489. Springer, 2–3

  46. [46]

    Luna Phipps-Costin, Carolyn Jane Anderson, Michael Greenberg, and Arjun Guha. 2021. Solver-based gradual type migration.Proc. ACM Program. Lang.5, OOPSLA (2021), 1–27. https://doi.org/10.1145/3485488

  47. [47]

    ACM Press, Paris, France

    POPL 2017 2017.Proceedings of the 44th ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages (POPL 2017). ACM Press, Paris, France

  48. [48]

    Norman Ramsey and Avi Pfeffer. 2002. Stochastic lambda calculus and monads of probability distributions. In POPL ’02

  49. [49]

    Jason Reed and Benjamin C. Pierce. 2010. Distance Makes the Types Grow Stronger: A Calculus for Differential Privacy. In Proceedings of the 15th ACM SIGPLAN International Conference on Functional Programming (ICFP 2010). ACM, 157–168

  50. [50]

    Amr Sabry and Matthias Felleisen. 1993. Reasoning about Programs in Continuation-Passing Style. In Lisp and Symbolic Computation. 288–298

  51. [51]

    Nasser Saheb-Djahromi. 1978. Probabilistic LCF. In MFCS

  52. [52]

    Adam Ścibior, Ohad Kammar, Matthijs Vákár, Sam Staton, Hongseok Yang, Yufei Cai, Klaus Ostermann, Sean K. Moss, Chris Heunen, and Zoubin Ghahramani. 2018. Denotational validation of higher-order Bayesian inference. Proc. ACM Program. Lang. 2, POPL (2018), 60:1–60:29. https://doi.org/10.1145/3158148

  53. [53]

    Roberto Segala and Nancy A. Lynch. 1995. Probabilistic Simulations for Probabilistic Processes. Nord. J. Comput. 2, 2 (1995), 250–273

  54. [54]

    Jeremy Siek and Walid Taha. 2006. Gradual Typing for Functional Languages. In Proceedings of the Scheme and Functional Programming Workshop. 81–92

  55. [55]

    Jeremy Siek, Peter Thiemann, and Phil Wadler. 2015. Blame and Coercion: Together Again for the First Time. In Proceedings of the 36th ACM SIGPLAN Conference on Programming Language Design and Implementation (PLDI 2015). ACM Press, Portland, OR, USA, 425–435

  56. [56]

    Jeremy Siek and Philip Wadler. 2010. Threesomes, with and without blame. In Proceedings of the 37th Annual ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages (POPL 2010). ACM Press, Madrid, Spain, 365–376

  57. [57]

    Jeremy G. Siek, Ronald Garcia, and Walid Taha. 2009. Exploring the Design Space of Higher-Order Casts. In ESOP

  58. [58]

    Jeremy G. Siek, Michael M. Vitousek, Matteo Cimini, and John Tang Boyland. 2015. Refined Criteria for Gradual Typing. In 1st Summit on Advances in Programming Languages (SNAPL 2015) (Leibniz International Proceedings in Informatics (LIPIcs)), Vol. 32. Schloss Dagstuhl – Leibniz-Zentrum für Informatik, Asilomar, California, USA, 274–293

  59. [59]

    Jeremy G. Siek, Michael M. Vitousek, Matteo Cimini, Sam Tobin-Hochstadt, and Ronald Garcia. 2015. Monotonic References for Efficient Gradual Typing. In ESOP

  60. [60]

    Asumu Takikawa, T. Stephen Strickland, Christos Dimoulas, Sam Tobin-Hochstadt, and Matthias Felleisen. 2012. Gradual Typing for First-Class Classes. In Proceedings of the 27th ACM SIGPLAN Conference on Object-Oriented Programming Systems, Languages and Applications (OOPSLA 2012). ACM Press, Tucson, AZ, USA, 793–810

  62. [62]

    Matías Toro, Ronald Garcia, and Éric Tanter. 2018. Type-Driven Gradual Security with References. ACM Transactions on Programming Languages and Systems 40, 4 (Nov. 2018), 16:1–16:55

  63. [63]

    Matías Toro and Éric Tanter. 2017. A Gradual Interpretation of Union Types. In Proceedings of the 24th Static Analysis Symposium (SAS 2017) (Lecture Notes in Computer Science), Vol. 10422. Springer-Verlag, New York City, NY, USA, 382–404

  64. [64]

    Matías Toro and Éric Tanter. 2020. Abstracting Gradual References. Science of Computer Programming 197 (Oct. 2020), 1–65

  65. [65]

    Dustin Tran, Matthew D. Hoffman, Rif A. Saurous, Eugene Brevdo, Kevin Murphy, and David M. Blei. 2017. Deep Probabilistic Programming. In 5th International Conference on Learning Representations, ICLR 2017, Toulon, France, April 24-26, 2017, Conference Track Proceedings. OpenReview.net. https://openreview.net/forum?id=Hy6b4Pqee

  67. [67]

    Jan-Willem van de Meent, Brooks Paige, Hongseok Yang, and Frank Wood. 2018. An Introduction to Probabilistic Programming. CoRR abs/1809.10756 (2018). arXiv:1809.10756 http://arxiv.org/abs/1809.10756

  68. [68]

    Philip Wadler and Robert Bruce Findler. 2009. Well-Typed Programs Can’t Be Blamed. In Proceedings of the 18th European Symposium on Programming Languages and Systems (ESOP 2009) (Lecture Notes in Computer Science), Giuseppe Castagna (Ed.), Vol. 5502. Springer-Verlag, York, UK, 1–16

  69. [69]

    Wenjia Ye, Bruno C. d. S. Oliveira, and Xuejing Huang. 2021. Type-Directed Operational Semantics for Gradual Typing. In ECOOP

  70. [70]

    Wenjia Ye, Bruno C. d. S. Oliveira, and Matías Toro. 2024. Merging Gradual Typing. Proc. ACM Program. Lang. 8, OOPSLA2, Article 294 (Oct. 2024), 29 pages. https://doi.org/10.1145/3689734

Wenjia Ye, Matías Toro, and Federico Olmedo

Appendix excerpt (TPLC syntax, logical relation, and compatibility lemmas):

$r \in \mathbb{R},\quad b \in \mathbb{B},\quad x \in \mathrm{Var},\quad p \in [0,1],\quad \tau \in \mathrm{Type},\quad T \in \mathrm{DType}$

$\tau ::= \mathsf{Real} \mid \mathsf{Bool} \mid \tau \to T$ (simple types)

$T ::= \{\!\{\, \tau_i^{p_i} \mid i \in I \,\}\!\}$ (distribution types)

$m, n ::= \ldots$ (terms; truncated in the source)

The logical relation underlying the metatheory:

$\mathcal{V}\llbracket T \rrbracket = \{ (\mathcal{V}_1, \mathcal{V}_2) \mid \mathcal{V}_1 = \{\!\{ v_i^{p_i} \mid i \in I \}\!\},\; \mathcal{V}_2 = \{\!\{ v_j'^{\,p_j} \mid j \in J \}\!\},\; \exists\, \xi = \{\!\{ \tau_k^{\omega_k} \mid k \in K \}\!\},\; (\xi, \mathcal{V}_1, \mathcal{V}_2) \in \mathrm{Atom}[T],\; \forall\, \omega_k > 0,\ i = \omega_k.l,\ j = \omega_k.r.\ (v_i, v_j') \in \mathcal{V}\llbracket \tau_k \rrbracket \}$

$\mathcal{T}\llbracket T \rrbracket = \{ (m_1, m_2) \mid m_1 \Downarrow_s^* \mathcal{V}_1 \wedge m_2 \Downarrow^* \mathcal{V}_2,\; (\mathcal{V}_1, \mathcal{V}_2) \in \mathcal{V}\llbracket T \rrbracket \}$

$\mathcal{G}\llbracket \cdot \rrbracket = \{ \emptyset \}$

$\mathcal{G}\llbracket \Gamma, x{:}\tau \rrbracket = \{ \gamma[(v, v')/x] \mid \gamma \in \mathcal{G}\llbracket \Gamma \rrbracket \wedge (v, v') \in \mathcal{V}\llbracket \tau \rrbracket \}$

$\mathrm{Atom}[\tau] = \{ (v, v') \mid\ \vdash_s v : \tau \ \wedge\ \vdash v' : \tau \}$

$\mathrm{Atom}[T] = \{ (\xi, \mathcal{V}_1, \mathcal{V}_2) \mid\ \vdash_s \mathcal{V}_1 : T_1 \ \wedge\ \vdash \mathcal{V}_2 : T_2 \ \wedge\ \xi \vdash T_1 \approx T_2 \,\ldots \}$

Compatibility (choice): if $\Gamma \vdash m_1 \approx m_1' : T_1$ and $\Gamma \vdash m_2 \approx m_2' : T_2$, then the mixtures of related results satisfy $(p \cdot \mathcal{V}_1 + (1-p) \cdot \mathcal{V}_1',\; p \cdot \mathcal{V}_2 + (1-p) \cdot \mathcal{V}_2' :: p \cdot T_1 + (1-p) \cdot T_2) \in \mathcal{V}\llbracket p \cdot T_1 + (1-p) \cdot T_2 \rrbracket$ (by Lemma 21).

Lemma 27 (Compatibility, lambda). If $\Gamma, x : \tau \vdash m \approx m' : T$ and $\varepsilon \vdash \tau \to T \sim \tau \to T$, then $\Gamma \vdash \lambda x{:}\tau.\, m \;\approx\; \varepsilon\, \lambda x{:}\tau.\, m' :: \tau \to T \;:\; T$. (The proof uses Lemma 20 for evidence compositions and Lemma 21 for the reduction of the cast application.)

Compatibility (if). If $\Gamma \vdash v \approx v' : \mathsf{Bool}$, $\Gamma \vdash m_1 \approx m_1' : T$, $\Gamma \vdash m_2 \approx m_2' : T$, and $\varepsilon \vdash \mathsf{Bool} \sim \mathsf{Bool}$, then $\Gamma \vdash \mathsf{if}\ v\ \mathsf{then}\ m_1\ \mathsf{else}\ m_2 \;\approx\; \mathsf{let}\ x = \varepsilon\, v' :: \mathsf{Bool}\ \mathsf{in\ if}\ x\ \mathsf{then}\ m_1'\ \mathsf{else}\ m_2' : T$.

Lemma 29 (Logical composition). If $\exists\, \omega(l, k).\ \sum_l \omega(l, k) = p_k \ \wedge\ \sum_k \omega(l, k) = p_l \ \wedge\ \big( \omega(l, k) > 0 \Rightarrow (\mathcal{V}_l, \mathcal{V}_k) \in \mathcal{V}\llbracket T_i \rrbracket \big)$, then $\ldots$ (statement truncated in the source)

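The distribution types in the appendix excerpt read a program's meaning as a finite multiset of value–probability pairs, and the binary choice operator combines two such distributions with weights $p$ and $1-p$, exactly the mixture $p \cdot T_1 + (1-p) \cdot T_2$ that the choice compatibility lemma tracks. A minimal sketch of that mixture, using exact rationals; the helper names `dist` and `choice` are hypothetical, not from the paper:

```python
from fractions import Fraction


def dist(*pairs):
    """Build a finite distribution as a dict from values to probabilities.

    Probabilities must sum to 1, mirroring a distribution type
    {{ tau_i^{p_i} | i in I }} whose weights p_i sum to 1.
    """
    d = {}
    for value, prob in pairs:
        d[value] = d.get(value, Fraction(0)) + Fraction(prob)
    assert sum(d.values()) == 1, "weights must sum to 1"
    return d


def choice(p, d1, d2):
    """Binary probabilistic choice: behave as d1 with probability p, else as d2.

    This is the mixture p * T1 + (1 - p) * T2 on distributions: each outcome's
    weight is p times its weight in d1 plus (1 - p) times its weight in d2.
    """
    p = Fraction(p)
    out = {}
    for value, q in d1.items():
        out[value] = out.get(value, Fraction(0)) + p * q
    for value, q in d2.items():
        out[value] = out.get(value, Fraction(0)) + (1 - p) * q
    return out


# A fair choice between the constant True and a fair coin:
# True gets 1/2 * 1 + 1/2 * 1/2 = 3/4, False gets 1/2 * 1/2 = 1/4.
mixed = choice(Fraction(1, 2),
               dist((True, 1)),
               dist((True, Fraction(1, 2)), (False, Fraction(1, 2))))
```

Exact rationals keep the check that weights sum to 1 decidable, which floating point would not; the same shape extends to the n-ary mixtures $\sum_{i \in I} p_i \cdot T_i$ used in the logical-composition lemma.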