pith. machine review for the scientific record. sign in

arxiv: 2605.02369 · v1 · submitted 2026-05-04 · 💻 cs.IR

Recognition: 3 theorem links

· Lean Theorem

Bridging Behavior and Semantics for Time-aware Cross-Domain Sequential Recommendation

Authors on Pith no claims yet

Pith reviewed 2026-05-08 18:17 UTC · model grok-4.3

classification 💻 cs.IR
keywords cross-domain sequential recommendationtime-aware modelingneural ordinary differential equationslarge language modelscounterfactual perturbationbehavioral preference evolutionsemantic preferencesdomain transfer
0
0 comments X

The pith

Modeling continuous-time behaviors with neural ODEs and time-sensitive semantics via LLM counterfactuals improves cross-domain sequential recommendations.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper claims that cross-domain sequential recommenders underperform when they ignore domain-specific differences in how interests decay at the same time intervals and when they treat semantic preferences as unchanging during transfer. It addresses this by evolving behavioral preferences continuously through a neural ordinary differential equation that separates long-term interests from short-term intentions, while generating temporal semantics by discretizing time-interval tokens and using large language models with counterfactual perturbations to make semantics responsive to time. A time-preference guided transfer module then adaptively weights cross-domain information to reduce negative transfer. A sympathetic reader would care because interaction data across domains remains sparse, and static or time-ignorant models miss how users' engagement patterns shift over identical intervals in different contexts.

Core claim

The central claim is that effective time-aware cross-domain sequential recommendation requires jointly modeling domain-specific temporal dynamics in behavior and semantics: a behavioral evolution module decouples long- and short-term preferences and updates them continuously via neural ODE with event-driven jumps, a semantic generator discretizes temporal interval tokens and applies LLM-based counterfactual perturbations to extract time-sensitive semantics, and a guided transfer module uses these time preferences to control adaptation weights and mitigate negative transfer.

What carries the argument

The BST-CDSR framework's behavioral preference evolution module (neural ODE with event-driven updates) and temporal counterfactual-enhanced semantic generator (discretized time tokens plus LLM perturbations).

If this is right

  • Recommendations become sensitive to domain-specific interaction frequencies and interest decay rates even when time intervals are identical.
  • Semantic preferences during cross-domain transfer are no longer treated as static but vary with discretized time signals.
  • Adaptive weighting based on time preferences reduces negative transfer between domains.
  • Overall accuracy rises on sparse cross-domain datasets by capturing both continuous behavioral evolution and time-varying semantics.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The hybrid use of differential equations for behavior and generative models for semantics may generalize to other sequential tasks where user state evolves continuously alongside textual context.
  • If the counterfactual step proves robust, future systems could routinely augment time encodings with language-model perturbations instead of relying on hand-crafted temporal features.
  • The approach suggests that recommendation pipelines will increasingly combine continuous dynamical models with discrete semantic generators to handle mismatched temporal patterns across domains.

Load-bearing premise

Discretizing temporal interval tokens and applying counterfactual perturbations through large language models will produce time-sensitive semantic preferences without introducing artifacts or domain-specific biases.

What would settle it

An ablation study on the same real-world datasets showing that performance drops to baseline levels when the temporal discretization or LLM counterfactual component is removed.

Figures

Figures reproduced from arXiv: 2605.02369 by Chong Zhang, Gangyi Ding, Haoyan Fu, Tianyu Huang, Yidong Li, Zemu Liu, Zhida Qin.

Figure 1
Figure 1. Figure 1: Data Analysis on Behavior and Semantic. (a) Behav view at source ↗
Figure 2
Figure 2. Figure 2: The overall architecture of the proposed method. The model consists of three key components: (i) a time-aware view at source ↗
Figure 3
Figure 3. Figure 3: Performance comparison with baselines across dif view at source ↗
read the original abstract

Cross-domain sequential recommendation (CDSR) alleviates interaction sparsity by jointly modeling user behaviors across multiple domains. While current studies have made some progresses, they still neglect two issues that severely impact recommendation performance: (i) ignoring domain-specific interaction frequencies and interest decay rates at identical time intervals; (ii) treating semantic preferences as time-invariant during cross-domain transfer. To address these, we propose a novel framework that bridges Behavior and Semantics for Time-aware Cross-Domain Sequential Recommendation (BST-CDSR). Specifically, we design a behavioral preference evolution module that decouples long-term interests and short-term intentions, and models continuous-time preference via a neural ordinary differential equation (ODE) with event-driven updates. Additionally, to capture time-aware semantic preferences, we introduce a temporal counterfactual-enhanced semantic generator that discretizes temporal interval tokens and leverages large language models (LLMs) to extract robust temporal semantics, where counterfactual perturbations enhance the time sensitivity of semantic preferences. Furthermore, we propose a time-preference guided domain transfer module to adaptively control transfer weights and mitigate negative transfer. Extensive experiments on real-world datasets demonstrate that BST-CDSR consistently outperforms baselines.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

3 major / 2 minor

Summary. The paper proposes BST-CDSR, a framework for time-aware cross-domain sequential recommendation that decouples long- and short-term behavioral preferences via a neural ODE with event-driven updates, generates time-aware semantic preferences by discretizing temporal interval tokens and applying LLM-based extraction with counterfactual perturbations, and uses a time-preference guided domain transfer module to adaptively control transfer weights and reduce negative transfer. It claims that this bridging of behavior and semantics yields consistent outperformance over baselines on real-world datasets.

Significance. If the experimental claims hold after proper validation, the work would offer a concrete advance in CDSR by explicitly modeling domain-specific temporal decay in both behavioral trajectories (via ODEs) and semantic representations (via LLM perturbations), addressing two gaps that prior transfer-based methods have left open. The combination of continuous-time dynamics with LLM-augmented semantics is a plausible direction for handling sparse, multi-domain interaction logs.

major comments (3)
  1. [Abstract and §4] Abstract and §4 (Experimental Setup): the central claim of 'consistent outperformance' is stated without any reported details on datasets, baseline implementations, evaluation metrics, statistical significance tests, or ablation configurations. This renders the performance gains unverifiable from the provided text and raises the possibility that reported improvements depend on post-hoc hyper-parameter choices or unstated implementation details.
  2. [§3.2] §3.2 (Temporal Counterfactual-Enhanced Semantic Generator): the assertion that discretizing interval tokens followed by LLM extraction and counterfactual perturbations 'enhance the time sensitivity of semantic preferences' rests on the untested assumption that the generated semantics remain free of LLM-induced artifacts, training-data biases, or inconsistent temporal reasoning. No grounding mechanism, consistency check, or ablation isolating perturbation quality is described, which directly affects the reliability of the subsequent domain-transfer module.
  3. [§3.3] §3.3 (Time-Preference Guided Domain Transfer): the adaptive control of transfer weights is presented as mitigating negative transfer, yet the paper provides no formal analysis or empirical demonstration that the learned weights correlate with the domain-specific frequency and decay rates claimed in the introduction. Without this link, the module's contribution to the overall performance cannot be isolated from the other components.
minor comments (2)
  1. [§3.1] Notation for the neural ODE event-driven updates and the discretization of temporal tokens should be made fully explicit (including the precise form of the ODE right-hand side and the token vocabulary) to allow reproduction.
  2. [Conclusion] The manuscript would benefit from a dedicated limitations paragraph discussing potential failure modes when LLM perturbations are applied to domains with very different temporal granularities.

Simulated Author's Rebuttal

3 responses · 0 unresolved

We thank the referee for the constructive comments that highlight opportunities to improve the clarity and verifiability of our manuscript. We address each major comment point by point below and indicate the revisions we will incorporate.

read point-by-point responses
  1. Referee: [Abstract and §4] Abstract and §4 (Experimental Setup): the central claim of 'consistent outperformance' is stated without any reported details on datasets, baseline implementations, evaluation metrics, statistical significance tests, or ablation configurations. This renders the performance gains unverifiable from the provided text and raises the possibility that reported improvements depend on post-hoc hyper-parameter choices or unstated implementation details.

    Authors: We agree that the abstract is intentionally concise and omits granular experimental details. Section 4 of the manuscript already describes the two real-world datasets (Amazon and Douban), the full set of baselines, the metrics (HR@K and NDCG@K), ablation configurations, and statistical significance testing via paired t-tests with p < 0.05. To make these elements immediately verifiable, we will (i) expand the abstract with a single sentence naming the datasets and metrics, (ii) insert a compact summary table at the start of §4, and (iii) move all hyper-parameter settings and implementation code references to a new appendix subsection. revision: yes

  2. Referee: [§3.2] §3.2 (Temporal Counterfactual-Enhanced Semantic Generator): the assertion that discretizing interval tokens followed by LLM extraction and counterfactual perturbations 'enhance the time sensitivity of semantic preferences' rests on the untested assumption that the generated semantics remain free of LLM-induced artifacts, training-data biases, or inconsistent temporal reasoning. No grounding mechanism, consistency check, or ablation isolating perturbation quality is described, which directly affects the reliability of the subsequent domain-transfer module.

    Authors: The concern about unverified LLM artifacts is valid. While the current manuscript reports an ablation that removes the entire semantic generator, it does not isolate the counterfactual perturbation step nor provide explicit quality checks. In the revision we will add (i) a dedicated paragraph with concrete examples of interval-token discretization and the resulting counterfactual perturbations, (ii) a small-scale human evaluation (three annotators, 100 samples) measuring time-sensitivity and factual consistency, and (iii) a new ablation that toggles only the perturbation component while keeping the rest of the pipeline fixed. revision: yes

  3. Referee: [§3.3] §3.3 (Time-Preference Guided Domain Transfer): the adaptive control of transfer weights is presented as mitigating negative transfer, yet the paper provides no formal analysis or empirical demonstration that the learned weights correlate with the domain-specific frequency and decay rates claimed in the introduction. Without this link, the module's contribution to the overall performance cannot be isolated from the other components.

    Authors: We accept that the manuscript currently lacks an explicit link between the learned transfer weights and the domain-specific frequency/decay statistics mentioned in the introduction. We will add (i) a new figure in §4.4 plotting the average transfer weight per domain against measured interaction frequency and estimated exponential decay rates, (ii) a quantitative correlation analysis (Pearson r) between these quantities, and (iii) a brief formal description of the weight-adaptation objective in the appendix. These additions will allow readers to isolate the module's contribution. revision: yes

Circularity Check

0 steps flagged

No circularity: derivation relies on external ODEs and LLMs without self-referential reduction

full rationale

The BST-CDSR framework introduces three modules (behavioral evolution via neural ODEs with event-driven updates, temporal counterfactual semantic generator using LLM extraction/perturbations on discretized intervals, and time-preference domain transfer) that are presented as independent and grounded in prior external literature on ODEs and LLMs. No equations or claims reduce a prediction or result to a fitted parameter or self-citation by construction; the abstract and described components do not invoke uniqueness theorems, ansatzes, or renamings that collapse to inputs. Experiments on real-world datasets serve as external validation rather than tautological confirmation. This is the standard non-circular case for a modular recommendation architecture.

Axiom & Free-Parameter Ledger

1 free parameters · 2 axioms · 0 invented entities

The framework rests on standard assumptions from sequential recommendation and ODE modeling plus ad-hoc choices for LLM integration and counterfactual generation.

free parameters (1)
  • transfer weights in domain transfer module
    Adaptive control of transfer weights is learned but depends on unspecified hyperparameters for time-preference guidance.
axioms (2)
  • domain assumption User behaviors can be decoupled into long-term interests and short-term intentions that evolve continuously via neural ODE.
    Invoked in the behavioral preference evolution module description.
  • ad hoc to paper LLMs can extract robust temporal semantics from discretized interval tokens when enhanced by counterfactual perturbations.
    Central to the semantic generator module.

pith-pipeline@v0.9.0 · 5514 in / 1286 out tokens · 41517 ms · 2026-05-08T18:17:45.717412+00:00 · methodology

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read and Pith papers without signing in.

Lean theorems connected to this paper

Citations machine-checked in the Pith Canon. Every link opens the source theorem in the public Lean library.

What do these tags mean?
matches
The paper's claim is directly supported by a theorem in the formal canon.
supports
The theorem supports part of the paper's argument, but the paper may add assumptions or extra steps.
extends
The paper goes beyond the formal theorem; the theorem is a base layer rather than the whole result.
uses
The paper appears to rely on the theorem as machinery.
contradicts
The paper's claim conflicts with a theorem or certificate in the canon.
unclear
Pith found a possible connection, but the passage is too broad, indirect, or ambiguous to say the theorem truly supports the claim.

Reference graph

Works this paper leans on

67 extracted references · 8 canonical work pages

  1. [1]

    Nawaf Alharbi and Doina Caragea. 2022. Cross-Domain Attentive Sequential Recommendations based on General and Current User Preferences (CD-ASR). In IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology(Melbourne, VIC, Australia)(WI-IAT ’21). Association for Computing Machinery, New York, NY, USA, 48–55

  2. [2]

    Ye Bi, Liqiang Song, Mengqiu Yao, Zhenyu Wu, Jianming Wang, and Jing Xiao

  3. [3]

    DCDIR: A Deep Cross-Domain Recommendation System for Cold Start Users in Insurance Domain. InProceedings of the 43rd International ACM SIGIR Bridging Behavior and Semantics for Time-aware Cross-Domain Sequential Recommendation SIGIR ’26, July 20–24, 2026, Melbourne, VIC, Australia. Conference on Research and Development in Information Retrieval(Virtual Ev...

  4. [4]

    Qingtian Bian, Marcus Vin’icius de Carvalho, Tieying Li, Jiaxing Xu, Hui Fang, and Yiping Ke. 2025. ABXI: Invariant Interest Adaptation for Task-Guided Cross- Domain Sequential Recommendation.Proceedings of the ACM on Web Conference 2025(2025)

  5. [5]

    Jiangxia Cao, Xin Cong, Jiawei Sheng, Tingwen Liu, and Bin Wang. 2022. Con- trastive Cross-Domain Sequential Recommendation. InProceedings of the 31st ACM International Conference on Information & Knowledge Management(Atlanta, GA, USA)(CIKM ’22). Association for Computing Machinery, New York, NY, USA, 138–147

  6. [6]

    Wentao Cheng, Zhida Qin, Zexue Wu, Pengzhan Zhou, and Tianyu Huang. 2025. Large Language Models Enhanced Hyperbolic Space Recommender Systems. arXiv:2504.05694 [cs.IR]

  7. [7]

    Wentao Cheng, Zhida Qin, Zexue Wu, Pengzhan Zhou, and Tianyu Huang. 2025. Large Language Models Enhanced Hyperbolic Space Recommender Systems. InProceedings of the 48th International ACM SIGIR Conference on Research and Development in Information Retrieval(Padua, Italy)(SIGIR ’25). Association for Computing Machinery, New York, NY, USA, 1944–1953

  8. [8]

    Haoyan Fu, Zhida Qin, Wenhao Xue, and Gangyi Ding. 2025. Fusing temporal and semantic dependencies for session-based recommendation.Information Processing & Management62, 1 (2025), 103896

  9. [9]

    Haoyan Fu, Zhida Qin, Shixiao Yang, Haoyao Zhang, Bin Lu, Shuang Li, Tianyu Huang, and John C.S. Lui. 2025. Time Matters: Enhancing Sequential Recommen- dations with Time-Guided Graph Neural ODEs. InProceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining V.2(Toronto ON, Canada)(KDD ’25). Association for Computing Machinery, Ne...

  10. [10]

    Lei Guo, Hao Liu, Lei Zhu, Weili Guan, and Zhiyong Cheng. 2023. DA-DAN: A Dual Adversarial Domain Adaption Network for Unsupervised Non-overlapping Cross-domain Recommendation.ACM Trans. Inf. Syst.42, 2, Article 48 (Nov. 2023), 27 pages

  11. [11]

    Lei Guo, Li Tang, Tong Chen, Lei Zhu, Quoc Viet Hung Nguyen, and Hongzhi Yin. 2021. DA-GCN: A Domain-aware Attentive Graph Convolution Network for Shared-account Cross-domain Sequential Recommendation. InProceedings of the Thirtieth International Joint Conference on Artificial Intelligence, IJCAI-21, Zhi-Hua Zhou (Ed.). International Joint Conferences on ...

  12. [12]

    Lei Guo, Jinyu Zhang, Tong Chen, Xinhua Wang, and Hongzhi Yin. 2023. Rein- forcement Learning-Enhanced Shared-Account Cross-Domain Sequential Rec- ommendation.IEEE Transactions on Knowledge and Data Engineering35, 7 (2023), 7397–7411

  13. [13]

    Guy Hadad, Haggai Roitman, Yotam Eshel, Bracha Shapira, and Lior Rokach. 2025. X-Cross: Dynamic Integration of Language Models for Cross-Domain Sequential Recommendation. InProceedings of the 48th International ACM SIGIR Conference on Research and Development in Information Retrieval(Padua, Italy)(SIGIR ’25). Association for Computing Machinery, New York,...

  14. [14]

    Min Hou, Xin Liu, Le Wu, Chenyi He, Hao Liu, Zhi Li, Xin Li, and Si Wei

  15. [15]

    arXiv:2510.26546 [cs.IR]

    WeaveRec: An LLM-Based Cross-Domain Sequential Recommendation Framework with Model Merging. arXiv:2510.26546 [cs.IR]

  16. [16]

    Duygu Sezen Islakoglu and Jan-Christoph Kalo. 2025. ChronoSense: Exploring Temporal Understanding in Large Language Models with Time Intervals of Events. InProceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers). Association for Computational Linguistics, Vienna, Austria, 590–602

  17. [17]

    Clark Mingxuan Ju, Leonardo Neves, Bhuvesh Kumar, Liam Collins, Tong Zhao, Yuwei Qiu, Qing Dou, Sohail Nizam, Sen Yang, and Neil Shah. 2025. Revisiting Self-attention for Cross-domain Sequential Recommendation. InProceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining V.2 (Toronto ON, Canada)(KDD ’25). Association for Computin...

  18. [18]

    Wang-Cheng Kang and Julian McAuley. 2018. Self-Attentive Sequential Rec- ommendation. In2018 IEEE International Conference on Data Mining (ICDM). 197–206

  19. [19]

    Jongho Kim and Seung-won Hwang. 2025. Counterfactual-Consistency Prompt- ing for Relative Temporal Understanding in Large Language Models. InProceed- ings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers). Association for Computational Linguistics, Vienna, Austria, 1210–1225

  20. [20]

    Namjun Lee and Jaekwang Kim. 2025. SEALR: Sequential Emotion-Aware LLM- Based Personalized Recommendation System. InProceedings of the 48th Inter- national ACM SIGIR Conference on Research and Development in Information Retrieval(Padua, Italy)(SIGIR ’25). Association for Computing Machinery, New York, NY, USA, 2906–2910

  21. [21]

    Hanyu Li, Weizhi Ma, Peijie Sun, Jiayu Li, Cunxiang Yin, Yancheng He, Guoqiang Xu, Min Zhang, and Shaoping Ma. 2024. Aiming at the Target: Filter Collab- orative Information for Cross-Domain Recommendation. InProceedings of the 47th International ACM SIGIR Conference on Research and Development in Infor- mation Retrieval(Washington DC, USA)(SIGIR ’24). As...

  22. [22]

    Jiacheng Li, Yujie Wang, and Julian McAuley. 2020. Time Interval Aware Self- Attention for Sequential Recommendation. InProceedings of the 13th International Conference on Web Search and Data Mining(Houston, TX, USA)(WSDM ’20). Association for Computing Machinery, New York, NY, USA, 322–330

  23. [23]

    Xuewei Li, Aitong Sun, Mankun Zhao, Jian Yu, Kun Zhu, Di Jin, Mei Yu, and Ruiguo Yu. 2023. Multi-Intention Oriented Contrastive Learning for Sequential Recommendation. InProceedings of the Sixteenth ACM International Conference on Web Search and Data Mining(Singapore, Singapore)(WSDM ’23). Association for Computing Machinery, New York, NY, USA, 411–419

  24. [24]

    Xiaodong Li, Hengzhu Tang, Jiawei Sheng, Xinghua Zhang, Li Gao, Suqi Cheng, Dawei Yin, and Tingwen Liu. 2025. Exploring Preference-Guided Diffusion Model for Cross-Domain Recommendation. InProceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining V.1(Toronto ON, Canada) (KDD ’25). Association for Computing Machinery, New York, N...

  25. [25]

    Guanyu Lin, Chen Gao, Yu Zheng, Jianxin Chang, Yanan Niu, Yang Song, Kun Gai, Zhiheng Li, Depeng Jin, Yong Li, and Meng Wang. 2024. Mixed Attention Network for Cross-domain Sequential Recommendation. InProceedings of the 17th ACM International Conference on Web Search and Data Mining(Merida, Mexico)(WSDM ’24). Association for Computing Machinery, New York...

  26. [26]

    Qidong Liu, Xian Wu, Wanyu Wang, Yejing Wang, Yuanshao Zhu, Xiangyu Zhao, Feng Tian, and Yefeng Zheng. 2025. LLMEmb: large language model can be a good embedding generator for sequential recommendation. InProceedings of the Thirty- Ninth AAAI Conference on Artificial Intelligence and Thirty-Seventh Conference on Innovative Applications of Artificial Intel...

  27. [27]

    Qidong Liu, Xiangyu Zhao, Yuhao Wang, Yejing Wang, Zijian Zhang, Yuqi Sun, Xi- ang Li, Maolin Wang, Pengyue Jia, Chong Chen, Wei Huang, and Feng Tian. 2025. Large Language Model Enhanced Recommender Systems: Methods, Applications and Trends. InProceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining V.2(Toronto ON, Canada)(KDD ...

  28. [28]

    Qidong Liu, Xiangyu Zhao, Yejing Wang, Zijian Zhang, Howard Zhong, Chong Chen, Xiang Li, Wei Huang, and Feng Tian. 2025. Bridge the Domains: Large Lan- guage Models Enhanced Cross-domain Sequential Recommendation. InProceed- ings of the 48th International ACM SIGIR Conference on Research and Development in Information Retrieval(Padua, Italy)(SIGIR ’25). A...

  29. [29]

    Weiming Liu, Xiaolin Zheng, Chaochao Chen, Jiajie Su, Xinting Liao, Mengling Hu, and Yanchao Tan. 2023. Joint Internal Multi-Interest Exploration and External Domain Alignment for Cross Domain Sequential Recommendation. InProceedings of the ACM Web Conference 2023(Austin, TX, USA)(WWW ’23). Association for Computing Machinery, New York, NY, USA, 383–394

  30. [30]

    Yuli Liu, Christian Walder, Lexing Xie, and Yiqun Liu. 2024. Probabilistic Atten- tion for Sequential Recommendation. InProceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining(Barcelona, Spain)(KDD ’24). Association for Computing Machinery, New York, NY, USA, 1956–1967

  31. [31]

    Ziwei Liu, Qidong Liu, Wanyu Wang, Yejing Wang, Tong Xu, Wei Huang, Chong Chen, Peng Chuan, and Xiangyu Zhao. 2025. LLM-EDT: Large Language Model Enhanced Cross-domain Sequential Recommendation with Dual-phase Training. arXiv:2511.19931 [cs.IR]

  32. [32]

    Fan Lu, Xiaolong Xu, Haolong Xiang, Lianyong Qi, Xiaokang Zhou, Fei Dai, and Wanchun Dou. 2025. CLLMRec: contrastive learning with LLMs-based view augmentation for sequential recommendation. InProceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence(Montreal, Canada)(IJCAI ’25). Article 351, 9 pages

  33. [33]

    Huishi Luo, Yiwen Chen, Yiqing Wu, Fuzhen Zhuang, and Deqing Wang. 2025. One for dozens: adaptive recommendation for all domains with counterfactual augmentation. InProceedings of the Thirty-Ninth AAAI Conference on Artificial Intelligence and Thirty-Seventh Conference on Innovative Applications of Artifi- cial Intelligence and Fifteenth Symposium on Educ...

  34. [34]

    Haokai Ma, Ruobing Xie, Lei Meng, Xin Chen, Xu Zhang, Leyu Lin, and Jie Zhou

  35. [35]

    Triple Sequence Learning for Cross-domain Recommendation.ACM Trans. Inf. Syst.42, 4, Article 91 (Feb. 2024), 29 pages

  36. [36]

    Muyang Ma, Pengjie Ren, Zhumin Chen, Zhaochun Ren, Lifan Zhao, Peiyu Liu, Jun Ma, and Maarten de Rijke. 2022. Mixed Information Flow for Cross-Domain Sequential Recommendations.ACM Trans. Knowl. Discov. Data16, 4, Article 64 (Jan. 2022), 32 pages

  37. [37]

    Muyang Ma, Pengjie Ren, Yujie Lin, Zhumin Chen, Jun Ma, and Maarten de Rijke

  38. [38]

    InProceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval (Paris, France)(SIGIR’19)

    𝜋-Net: A Parallel Information-sharing Network for Shared-account Cross- domain Sequential Recommendations. InProceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval (Paris, France)(SIGIR’19). Association for Computing Machinery, New York, NY, SIGIR ’26, July 20–24, 2026, Melbourne, VIC, Australia. Zh...

  39. [39]

    Chung Park, Taesan Kim, Taekyoon Choi, Junui Hong, Yelim Yu, Mincheol Cho, Kyunam Lee, Sungil Ryu, Hyungjun Yoon, Minsung Choi, and Jaegul Choo. 2023. Cracking the Code of Negative Transfer: A Cooperative Game Theoretic Ap- proach for Cross-Domain Sequential Recommendation. InProceedings of the 32nd ACM International Conference on Information and Knowledg...

  40. [40]

    Chung Park, Taesan Kim, Hyungjun Yoon, Junui Hong, Yelim Yu, Mincheol Cho, Minsung Choi, and Jaegul Choo. 2024. Pacer and Runner: Cooperative Learning Framework between Single- and Cross-Domain Sequential Recommendation. In Proceedings of the 47th International ACM SIGIR Conference on Research and Devel- opment in Information Retrieval(Washington DC, USA)...

  41. [42]

    Zhida Qin, Wentao Cheng, Wenxing Ding, and Gangyi Ding. 2025. Hyperbolic Graph Contrastive Learning for Collaborative Filtering.IEEE Transactions on Knowledge and Data Engineering37, 3 (2025), 1255–1267

  42. [43]

    Tingjia Shen, Hao Wang, Jiaqing Zhang, Sirui Zhao, Liangyue Li, Zulong Chen, Defu Lian, and Enhong Chen. 2024. Exploring User Retrieval Integration to- wards Large Language Models for Cross-Domain Sequential Recommendation. arXiv:2406.03085 [cs.LG]

  43. [44]

    Zijian Song, Wenhan Zhang, Lifang Deng, Jiandong Zhang, Zhihua Wu, Kaigui Bian, and Bin Cui. 2024. Mitigating Negative Transfer in Cross-Domain Rec- ommendation via Knowledge Transferability Enhancement. InProceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (Barcelona, Spain)(KDD ’24). Association for Computing Machinery, ...

  44. [45]

    Wenchao Sun, Muyang Ma, Pengjie Ren, Yujie Lin, Zhumin Chen, Zhaochun Ren, Jun Ma, and Maarten de Rijke. 2023. Parallel Split-Join Networks for Shared Account Cross-Domain Sequential Recommendations.IEEE Transactions on Knowledge and Data Engineering35, 4 (2023), 4106–4123

  45. [46]

    Shuliang Wang, Jiabao Zhu, Yi Wang, Chen Ma, Xin Zhao, Yansen Zhang, Ziqiang Yuan, and Sijie Ruan. 2025. Hierarchical Gating Network for Cross-Domain Sequential Recommendation.ACM Trans. Inf. Syst.43, 4, Article 90 (May 2025), 32 pages

  46. [47]

    Xinhua Wang, Houping Yue, Zizheng Wang, Liancheng Xu, and Jinyu Zhang

  47. [48]

    Unbiased and Robust: External Attention-enhanced Graph Contrastive Learning for Cross-domain Sequential Recommendation.2023 IEEE International Conference on Data Mining Workshops (ICDMW)(2023), 1526–1534

  48. [49]

    Yidan Wang, Xuri Ge, Xin Chen, Ruobing Xie, Su Yan, Xu Zhang, Zhumin Chen, Jun Ma, and Xin Xin. 2025. Exploration and Exploitation of Hard Negative Samples for Cross-Domain Sequential Recommendation. InProceedings of the Eighteenth ACM International Conference on Web Search and Data Mining(Han- nover, Germany)(WSDM ’25). Association for Computing Machiner...

  49. [50]

    Yuhao Wang, Junwei Pan, Pengyue Jia, Wanyu Wang, Maolin Wang, Zhixiang Feng, Xiaotian Li, Jie Jiang, and Xiangyu Zhao. 2025. Pre-train, Align, and Disentangle: Empowering Sequential Recommendation with Large Language Models. InProceedings of the 48th International ACM SIGIR Conference on Research and Development in Information Retrieval(Padua, Italy)(SIGI...

  50. [51]

    Ruobing Xie, Qi Liu, Liangdong Wang, Shukai Liu, Bo Zhang, and Leyu Lin

  51. [52]

    InProceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (Washington DC, USA)(KDD ’22)

    Contrastive Cross-domain Recommendation in Matching. InProceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (Washington DC, USA)(KDD ’22). Association for Computing Machinery, New York, NY, USA, 4226–4236

  52. [53]

    Xu Xie, Fei Sun, Zhaoyang Liu, Shiwen Wu, Jinyang Gao, Jiandong Zhang, Bolin Ding, and Bin Cui. 2022. Contrastive Learning for Sequential Recommendation. In2022 IEEE 38th International Conference on Data Engineering (ICDE). 1259–1273

  53. [54]

    Haoran Xin, Ying Sun, Chao Wang, and Hui Xiong. 2025. LLMCDSR: Enhancing Cross-Domain Sequential Recommendation with Large Language Models.ACM Trans. Inf. Syst.43, 5, Article 120 (July 2025), 33 pages

  54. [55]

    Wujiang Xu, Qitian Wu, Runzhong Wang, Mingming Ha, Qiongxu Ma, Linxun Chen, Bing Han, and Junchi Yan. 2024. Rethinking Cross-Domain Sequential Recommendation under Open-World Assumptions. InProceedings of the ACM Web Conference 2024(Singapore, Singapore)(WWW ’24). Association for Computing Machinery, New York, NY, USA, 3173–3184

  55. [56]

    Zitao Xu, Xiaoqing Chen, Weike Pan, and Zhong Ming. 2025. Heterogeneous Graph Transfer Learning for Category-aware Cross-Domain Sequential Recom- mendation. InProceedings of the ACM on Web Conference 2025(Sydney NSW, Australia)(WWW ’25). Association for Computing Machinery, New York, NY, USA, 1951–1962

  56. [57]

    Shenghao Yang, Weizhi Ma, Peijie Sun, Qingyao Ai, Yiqun Liu, Mingchen Cai, and Min Zhang. 2024. Sequential Recommendation with Latent Relations based on Large Language Model. InProceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval(Washington DC, USA)(SIGIR ’24). Association for Computing Machinery,...

  57. [58]

    Shixiao Yang, Zhida Qin, Enjun Du, Pengzhan Zhou, and Tianyu Huang. 2024. Dual Social View Enhanced Contrastive Learning for Social Recommendation. IEEE Transactions on Computational Social Systems(2024), 1–15

  58. [59]

    Xiaoxin Ye, Chengkai Huang, Hongtao Huang, and Lina Yao. 2025. Gaussian Mixture Flow Matching with Domain Alignment for Multi-Domain Sequential Recommendation. arXiv:2510.21021 [cs.IR]

  59. [60]

    Mingjia Yin, Hao Wang, Wei Guo, Yong Liu, Zhi Li, Sirui Zhao, Defu Lian, and Enhong Chen. 2024. Learning Partially Aligned Item Representation for Cross- Domain Sequential Recommendation.ArXivabs/2405.12473 (2024)

  60. [61]

    Tianzi Zang, Yanmin Zhu, Ruohan Zhang, Chunyang Wang, Ke Wang, and Jiadi Yu. 2023. Contrastive Multi-view Interest Learning for Cross-domain Sequential Recommendation.ACM Trans. Inf. Syst.42, 3, Article 76 (Dec. 2023), 30 pages

  61. [62]

    Yongfu Zha, Xinxin Dong, Haokai Ma, Yonghui Yang, and Xiaodong Wang. 2025. Align-for-Fusion: Harmonizing Triple Preferences via Dual-oriented Diffusion for Cross-domain Sequential Recommendation.ArXivabs/2508.05074 (2025)

  62. [63]

    Yipeng Zhang, Xin Wang, Hong Chen, and Wenwu Zhu. 2023. Adaptive Disen- tangled Transformer for Sequential Recommendation. InProceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining(Long Beach, CA, USA)(KDD ’23). Association for Computing Machinery, New York, NY, USA, 3434–3445

  63. [64]

    Anhao Zhao, Fanghua Ye, Jinlan Fu, and Xiaoyu Shen. 2024. Unveiling In-Context Learning: A Coordinate System to Understand Its Working Mechanism. InPro- ceedings of the 2024 Conference on Empirical Methods in Natural Language Pro- cessing. Association for Computational Linguistics, Miami, Florida, USA, 12375– 12400

  64. [65]

    Chuang Zhao, Hongke Zhao, Ming HE, Jian Zhang, and Jianping Fan. 2023. Cross- domain recommendation via user interest alignment. InProceedings of the ACM Web Conference 2023(Austin, TX, USA)(WWW ’23). Association for Computing Machinery, New York, NY, USA, 887–896

  65. [66]

    Wei Zhao, Bo Li, and Xian Mo. 2025. Contrastive cross-domain sequential rec- ommendation with attention-aware mechanism.Complex & Intelligent Systems (2025)

  66. [67]

    Wayne Xin Zhao, Kun Zhou, Junyi Li, Tianyi Tang, Xiaolei Wang, Yupeng Hou, Yingqian Min, Beichen Zhang, Junjie Zhang, Zican Dong, Yifan Du, Chen Yang, Yushuo Chen, Zhipeng Chen, Jinhao Jiang, Ruiyang Ren, Yifan Li, Xinyu Tang, Zikang Liu, Peiyu Liu, Jian-Yun Nie, and Ji-Rong Wen. 2025. A Survey of Large Language Models. arXiv:2303.18223 [cs.CL]

  67. [68]

    Donglin Zhou, Xinbei Cai, and Weike Pan. 2025. Contrastive Text-enhanced Transformer for Cross-Domain Sequential Recommendation. InProceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining V.2 (Toronto ON, Canada)(KDD ’25). Association for Computing Machinery, New York, NY, USA, 4110–4119