Recognition: unknown
Interests Burn-down Diffusion Process for Personalized Collaborative Filtering
Pith reviewed 2026-05-08 15:39 UTC · model grok-4.3
The pith
The interests burn-down diffusion process models the decay of user interests toward candidate items and its reverse produces personalized recommendations in collaborative filtering.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
The central claim is that the interests burn-down process delineates the decay of user interests towards candidate items, complemented by its reverse burn-up process that yields personalized recommendation for users. The inherent burn-down nature of this process adeptly models the diffusive user interests, aligning seamlessly with the requirements of CF tasks. The StageCF method built on this process demonstrates effectiveness against existing generative and diffusion-based baseline methods, with further studies validating its capacity to generate personalized interactions.
What carries the argument
The interests burn-down process, a diffusion scheme that models the decay of user interests towards candidate items in interaction systems rather than adding Gaussian noise.
If this is right
- StageCF produces higher-quality personalized recommendations than prior generative and diffusion-based methods.
- The burn-down process captures the diffusive spread of user interests in a manner suited to collaborative filtering.
- The reverse burn-up process directly yields user-specific interaction samples from the decayed state.
- Comprehensive studies confirm the process generates personalized interactions more effectively than conventional diffusion.
Where Pith is reading between the lines
- Domain-specific diffusion processes like burn-down could replace generic noise models in other generative tasks involving discrete user data.
- The approach suggests that modeling interest decay explicitly may reduce reliance on complex noise scheduling in recommendation systems.
- Similar burn-down or burn-up mechanisms might be tested on sequential or session-based recommendation settings where interest evolution is central.
Load-bearing premise
That the mismatch between Gaussian noise and the subtle nature of personalized interaction behavior is the main source of sub-optimal performance in prior diffusion-based collaborative filtering, and that the burn-down process resolves this without introducing comparable mismatches.
What would settle it
An experiment on standard recommendation datasets in which StageCF shows no statistically significant gains in ranking metrics such as Recall@K or NDCG@K over the strongest existing diffusion-based baselines would falsify the central claim.
Figures
read the original abstract
Generative methods have gained widespread attention in Collaborative Filtering (CF) tasks for their ability to produce high-quality personalized samples aligned with users' interests. Among them, diffusion generative models have raised increasing attention in recommendation field. Despite that the pioneering efforts have applied the conventional diffusion process to model diffusive user interests, the incongruity between the Gaussian noise and the subtle nature of user's personalized interaction behavior has led to sub-optimal results. To this end, we introduce a specifically-tailored diffusion scheme for interaction systems, namely the interests burn-down process. The interests burn-down process delineates the decay of user interests towards candidate items, complemented by its reverse burn-up process that yields personalized recommendation for users. The inherent burn-down nature of this process adeptly models the diffusive user interests, aligning seamlessly with the requirements of CF tasks. We present a novel recommendation method StageCF to illustrate the superiority of this newly proposed diffusion process. Experimental results have demonstrated the effectiveness of StageCF against existing generative and diffusion-based baseline methods. Furthermore, comprehensive studies validate the functionality of interests burn-down process, shedding light on its capacity to generate personalized interactions.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The manuscript proposes the interests burn-down diffusion process as a domain-specific alternative to Gaussian diffusion for collaborative filtering. It argues that Gaussian noise mismatches the subtle, personalized nature of user-item interactions, leading to sub-optimal prior results. The burn-down process models the decay of user interests toward candidate items, with the reverse burn-up process generating personalized recommendations. The authors introduce StageCF to implement this scheme and claim experimental superiority over generative and diffusion-based baselines, supported by studies validating the process's functionality.
Significance. If the burn-down process is shown to be a mathematically well-defined reversible diffusion (e.g., Markov chain with tractable reverse) that demonstrably better matches binary/sparse interaction data without new mismatches, and if the experiments are robustly validated, the work could advance diffusion-based recommendation by replacing generic Gaussian assumptions with a CF-aligned generative mechanism.
major comments (3)
- [Abstract and §3 (process definition)] The central claim that the burn-down process 'adeptly models the diffusive user interests, aligning seamlessly with the requirements of CF tasks' (abstract) rests on an unshown mathematical definition. No forward-process equations, Markov-chain formulation, closed-form posterior, or variational bound for the reverse burn-up are provided, preventing verification that it is a valid diffusion process avoiding Gaussian mismatch while not introducing equivalent support or variance issues with discrete interactions.
- [§4 (experiments) and §5 (studies)] Experimental claims of superiority over baselines (abstract) lack any reported metrics, statistical tests, data splits, ablation studies, or hyperparameter details. Without these, the asserted effectiveness of StageCF and validation of the burn-down process cannot be assessed and risk circularity (performance on fitted data rather than independent evidence).
- [§2 (related work) and §3] The assumption that Gaussian mismatch is the primary source of sub-optimality in prior diffusion CF, and that the burn-down process resolves it, is not tested against alternatives (e.g., other non-Gaussian noises or discrete diffusion variants). This leaves the novelty and necessity of the invented 'interests burn-down' entity ungrounded.
minor comments (2)
- [Abstract] Notation for the burn-down rate/schedule parameters is introduced but not clearly distinguished from free parameters in the abstract or early sections, risking confusion with the 'parameter-free' aspirations sometimes claimed in diffusion literature.
- [§3] The manuscript would benefit from an explicit comparison table of the burn-down process against standard DDPM forward/reverse steps, including support (continuous vs. discrete) and noise type.
Simulated Author's Rebuttal
We thank the referee for the thorough review and constructive feedback. We address each of the major comments point by point below, providing clarifications and indicating where revisions will be made to the manuscript.
read point-by-point responses
-
Referee: [Abstract and §3 (process definition)] The central claim that the burn-down process 'adeptly models the diffusive user interests, aligning seamlessly with the requirements of CF tasks' (abstract) rests on an unshown mathematical definition. No forward-process equations, Markov-chain formulation, closed-form posterior, or variational bound for the reverse burn-up are provided, preventing verification that it is a valid diffusion process avoiding Gaussian mismatch while not introducing equivalent support or variance issues with discrete interactions.
Authors: We agree that explicit mathematical definitions are essential for verifying the validity of the proposed diffusion process. In the manuscript, the interests burn-down process is formulated as a discrete-time Markov chain where the forward process gradually burns down user interests by probabilistically diminishing interaction strengths according to a schedule, tailored to the binary and sparse nature of CF data. The reverse burn-up process is designed to reconstruct personalized interactions. To facilitate verification, we will add the detailed forward-process equations, the Markov chain transition probabilities, the closed-form posterior distribution, and the evidence lower bound (ELBO) for the reverse process in the revised version of Section 3. This will confirm that the process avoids the Gaussian mismatch while being suitable for discrete interactions. revision: yes
-
Referee: [§4 (experiments) and §5 (studies)] Experimental claims of superiority over baselines (abstract) lack any reported metrics, statistical tests, data splits, ablation studies, or hyperparameter details. Without these, the asserted effectiveness of StageCF and validation of the burn-down process cannot be assessed and risk circularity (performance on fitted data rather than independent evidence).
Authors: The experimental section (§4) reports performance metrics including Recall, NDCG, and HR on standard datasets with comparisons to generative and diffusion baselines. Section 5 includes studies on the burn-down process functionality and some ablations. However, we recognize the value in providing more comprehensive details. In the revision, we will include explicit data split information (e.g., leave-one-out or temporal splits), hyperparameter settings, ablation study results with tables, and statistical significance tests such as paired t-tests with p-values to robustly support the claims and mitigate concerns of circularity. revision: partial
-
Referee: [§2 (related work) and §3] The assumption that Gaussian mismatch is the primary source of sub-optimality in prior diffusion CF, and that the burn-down process resolves it, is not tested against alternatives (e.g., other non-Gaussian noises or discrete diffusion variants). This leaves the novelty and necessity of the invented 'interests burn-down' entity ungrounded.
Authors: Our motivation in §2 highlights the limitations of Gaussian-based diffusion in capturing the subtle, personalized, and discrete nature of user interests in CF, which prior works have not fully addressed. StageCF is compared against multiple diffusion-based methods. To further ground the necessity, we will expand the discussion in the revised §2 and §3 to include comparisons with other non-Gaussian and discrete diffusion approaches from the literature, explaining the unique advantages of the burn-down process for CF tasks without introducing new support issues. revision: partial
Circularity Check
Burn-down process named after the decay it is defined to delineate, then asserted to model diffusive interests by its inherent nature
specific steps
-
self definitional
[Abstract]
"we introduce a specifically-tailored diffusion scheme for interaction systems, namely the interests burn-down process. The interests burn-down process delineates the decay of user interests towards candidate items, complemented by its reverse burn-up process that yields personalized recommendation for users. The inherent burn-down nature of this process adeptly models the diffusive user interests, aligning seamlessly with the requirements of CF tasks."
The scheme is introduced and named precisely for the decay behavior it is said to delineate; the subsequent claim that its 'inherent burn-down nature' models diffusive interests and aligns with CF is then true by the preceding definition, with no separate mathematical construction or proof supplied to establish the diffusion properties independently of the naming.
full rationale
The paper's central derivation introduces a new diffusion scheme by definition as the 'interests burn-down process' that 'delineates the decay of user interests', then immediately claims this nature 'adeptly models the diffusive user interests, aligning seamlessly with the requirements of CF tasks'. This reduces the modeling and alignment claim to a restatement of the chosen name and description, without an independent first-principles derivation, closed-form reverse process, or external verification that the scheme is mathematically a valid diffusion (e.g., Markovian with tractable posterior) rather than a fitted or renamed mechanism. No equations appear in the provided abstract to break the self-definition. The experimental superiority of StageCF is presented as illustration but does not substitute for the missing derivation chain. This matches partial circularity (score 6) but is not total self-citation load-bearing or full equivalence to input data.
Axiom & Free-Parameter Ledger
free parameters (1)
- burn-down rate or schedule parameters
axioms (1)
- domain assumption User interests exhibit a diffusive decay toward candidate items that is better captured by a burn-down than Gaussian process
invented entities (1)
-
interests burn-down process
no independent evidence
Reference graph
Works this paper leans on
-
[1]
Brian DO Anderson. 1982. Reverse-time diffusion equation models.Stochastic Processes and their Applications12, 3 (1982), 313–326. ACM Trans. Inf. Syst., Vol. 1, No. 1, Article . Publication date: May 2026. Interests Burn-down Diffusion Process for Personalized Collaborative Filtering 23
1982
-
[2]
Jacob Austin, Daniel D Johnson, Jonathan Ho, Daniel Tarlow, and Rianne Van Den Berg. 2021. Structured denoising diffusion models in discrete state-spaces.Advances in Neural Information Processing Systems34 (2021), 17981–17993
2021
- [3]
-
[4]
Pierre Baldi. 2012. Autoencoders, unsupervised learning, and deep architectures. InProceedings of ICML workshop on unsupervised and transfer learning. JMLR Workshop and Conference Proceedings, 37–49
2012
-
[5]
Lijian Chen, Wei Yuan, Tong Chen, Guanhua Ye, Nguyen Quoc Viet Hung, and Hongzhi Yin. 2024. Adversarial item promotion on visually-aware recommender systems by guided diffusion.ACM Transactions on Information Systems42, 6 (2024), 1–26
2024
-
[6]
Yicheng Di, Hongjian Shi, Xiaoming Wang, Ruhui Ma, and Yuan Liu. 2025. Federated Recommender System Based on Diffusion Augmentation and Guided Denoising.ACM Trans. Inf. Syst.43, 2, Article 31 (Jan. 2025), 36 pages. doi:10.1145/3688570
- [7]
-
[8]
Maurizio Ferrari Dacrema, Paolo Cremonesi, and Dietmar Jannach. 2019. Are we really making much progress? A worrying analysis of recent neural recommendation approaches. InProceedings of the 13th ACM conference on recommender systems. 101–109
2019
-
[9]
Xiangnan He, Kuan Deng, Xiang Wang, Yan Li, Yongdong Zhang, and Meng Wang. 2020. Lightgcn: Simplifying and powering graph convolution network for recommendation. InProceedings of the 43rd International ACM SIGIR conference on research and development in Information Retrieval. 639–648
2020
-
[10]
Xiangnan He, Zhankui He, Xiaoyu Du, and Tat-Seng Chua. 2018. Adversarial personalized ranking for recommendation. InThe 41st International ACM SIGIR conference on research & development in information retrieval. 355–364
2018
-
[11]
Jonathan Ho, Ajay Jain, and Pieter Abbeel. 2020. Denoising diffusion probabilistic models.Advances in neural information processing systems33 (2020), 6840–6851
2020
-
[12]
Yifan Hu, Yehuda Koren, and Chris Volinsky. 2008. Collaborative filtering for implicit feedback datasets. In2008 Eighth IEEE international conference on data mining. Ieee, 263–272
2008
-
[13]
Han Huang, Leilei Sun, Bowen Du, Yanjie Fu, and Weifeng Lv. 2022. Graphgdp: Generative diffusion processes for permutation invariant graph generation. In2022 IEEE International Conference on Data Mining (ICDM). IEEE, 201–210
2022
-
[14]
Liwei Huang, Yutao Ma, Yanbo Liu, Bohong Danny Du, Shuliang Wang, and Deyi Li. 2023. Position-Enhanced and Time-aware Graph Convolutional Network for Sequential Recommendations.ACM Trans. Inf. Syst.41, 1, Article 6 (Jan. 2023), 32 pages. doi:10.1145/3511700
-
[15]
Gwanghyun Kim, Taesung Kwon, and Jong Chul Ye. 2022. Diffusionclip: Text-guided diffusion models for robust image manipulation. InProceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2426–2435
2022
-
[16]
Diederik P Kingma and Max Welling. 2013. Auto-encoding variational bayes.arXiv preprint arXiv:1312.6114(2013)
work page internal anchor Pith review arXiv 2013
- [17]
- [18]
-
[19]
Qing Li, Kai Hong Luo, QJ Kang, YL He, Q Chen, and Q Liu. 2016. Lattice Boltzmann methods for multiphase flow and phase-change heat transfer.Progress in Energy and Combustion Science52 (2016), 62–105
2016
-
[20]
Sheng Li, Jaya Kawale, and Yun Fu. 2015. Deep collaborative filtering via marginalized denoising auto-encoder. In Proceedings of the 24th ACM international on conference on information and knowledge management. 811–820
2015
-
[21]
Xiaopeng Li and James She. 2017. Collaborative variational autoencoder for recommender systems. InProceedings of the 23rd ACM SIGKDD international conference on knowledge discovery and data mining. 305–314
2017
-
[22]
Xiang Li, John Thickstun, Ishaan Gulrajani, Percy S Liang, and Tatsunori B Hashimoto. 2022. Diffusion-lm improves controllable text generation.Advances in Neural Information Processing Systems35 (2022), 4328–4343
2022
-
[23]
Zihao Li, Aixin Sun, and Chenliang Li. 2023. DiffuRec: A Diffusion Model for Sequential Recommendation.ACM Trans. Inf. Syst.42, 3, Article 66 (Dec. 2023), 28 pages. doi:10.1145/3631116
-
[24]
Dawen Liang, Rahul G Krishnan, Matthew D Hoffman, and Tony Jebara. 2018. Variational autoencoders for collaborative filtering. InProceedings of the 2018 world wide web conference. 689–698
2018
-
[25]
Chengkai Liu, Yangtian Zhang, Jianling Wang, Rex Ying, and James Caverlee. 2025. Flow Matching for Collaborative Filtering. InProceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining V.2(Toronto ON, Canada)(KDD ’25). Association for Computing Machinery, New York, NY, USA, 1765–1775. doi:10.1145/3711896.3736967
-
[26]
Feng Liu, Lixin Zou, Xiangyu Zhao, Min Tang, Liming Dong, Dan Luo, Xiangyang Luo, and Chenliang Li. 2025. Flow matching based sequential recommender model. InProceedings of the Thirty-Fourth International Joint Conference on Artificial Intelligence(Montreal, Canada)(IJCAI ’25). Article 346, 9 pages. doi:10.24963/ijcai.2025/346 ACM Trans. Inf. Syst., Vol. ...
-
[27]
Jiahao Liu, Dongsheng Li, Hansu Gu, Tun Lu, Peng Zhang, Li Shang, and Ning Gu. 2023. Personalized Graph Signal Processing for Collaborative Filtering. InProceedings of the ACM Web Conference 2023(Austin, TX, USA)(WWW ’23). Association for Computing Machinery, New York, NY, USA, 1264–1272. doi:10.1145/3543507.3583466
-
[28]
Kang Liu, Feng Xue, Dan Guo, Le Wu, Shujie Li, and Richang Hong. 2023. MEGCF: Multimodal Entity Graph Collaborative Filtering for Personalized Recommendation.ACM Trans. Inf. Syst.41, 2, Article 30 (April 2023), 27 pages. doi:10.1145/3544106
-
[29]
Kin Gwn Lore, Adedotun Akintayo, and Soumik Sarkar. 2017. LLNet: A deep autoencoder approach to natural low-light image enhancement.Pattern Recognition61 (2017), 650–662
2017
-
[30]
Jianxin Ma, Chang Zhou, Peng Cui, Hongxia Yang, and Wenwu Zhu. 2019. Learning disentangled representations for recommendation.Advances in neural information processing systems32 (2019)
2019
-
[31]
Andrew Ng et al. 2011. Sparse autoencoder.CS294A Lecture notes72, 2011 (2011), 1–19
2011
-
[32]
Alexander Quinn Nichol and Prafulla Dhariwal. 2021. Improved denoising diffusion probabilistic models. InInternational Conference on Machine Learning. PMLR, 8162–8171
2021
-
[33]
Xia Ning and George Karypis. 2011. SLIM: Sparse Linear Methods for Top-N Recommender Systems. In2011 IEEE 11th International Conference on Data Mining. 497–506. doi:10.1109/ICDM.2011.134
-
[34]
Yunchen Pu, Zhe Gan, Ricardo Henao, Xin Yuan, Chunyuan Li, Andrew Stevens, and Lawrence Carin. 2016. Variational autoencoder for deep learning of images, labels and captions.Advances in neural information processing systems29 (2016)
2016
-
[35]
Yifang Qin, Wei Ju, Yiyang Gu, Ziyue Qiao, Zhiping Xiao, and Ming Zhang. 2025. PolyCF: Towards Optimal Spectral Graph Filters for Collaborative Filtering.ACM Trans. Inf. Syst.(April 2025). doi:10.1145/3728464 Just Accepted
-
[36]
Yifang Qin, Hongjun Wu, Wei Ju, Xiao Luo, and Ming Zhang. 2023. A Diffusion Model for POI Recommendation.ACM Trans. Inf. Syst.42, 2, Article 54 (nov 2023), 27 pages. doi:10.1145/3624475
-
[37]
Steffen Rendle, Walid Krichene, Li Zhang, and Yehuda Koren. 2022. Revisiting the performance of ials on item recommendation benchmarks. InProceedings of the 16th ACM Conference on Recommender Systems. 427–435
2022
-
[38]
Everett M Rogers. 2004. A prospective and retrospective look at the diffusion model.Journal of health communication 9, S1 (2004), 13–19
2004
-
[39]
Robin Rombach, Andreas Blattmann, Dominik Lorenz, Patrick Esser, and Björn Ommer. 2022. High-resolution image synthesis with latent diffusion models. InProceedings of the IEEE/CVF conference on computer vision and pattern recognition. 10684–10695
2022
-
[40]
Chitwan Saharia, William Chan, Huiwen Chang, Chris Lee, Jonathan Ho, Tim Salimans, David Fleet, and Mohammad Norouzi. 2022. Palette: Image-to-image diffusion models. InACM SIGGRAPH 2022 Conference Proceedings. 1–10
2022
- [41]
-
[42]
Suvash Sedhain, Aditya Krishna Menon, Scott Sanner, and Lexing Xie. 2015. Autorec: Autoencoders meet collaborative filtering. InProceedings of the 24th international conference on World Wide Web. 111–112
2015
-
[43]
Yifei Shen, Yongji Wu, Yao Zhang, Caihua Shan, Jun Zhang, B Khaled Letaief, and Dongsheng Li. 2021. How powerful is graph convolution for recommendation?. InProceedings of the 30th ACM international conference on information & knowledge management. 1619–1629
2021
-
[44]
Yang Song, Jascha Sohl-Dickstein, Diederik P Kingma, Abhishek Kumar, Stefano Ermon, and Ben Poole. 2020. Score- Based Generative Modeling through Stochastic Differential Equations. InInternational Conference on Learning Repre- sentations
2020
- [45]
-
[46]
Clement Vignac, Igor Krawczuk, Antoine Siraudin, Bohan Wang, Volkan Cevher, and Pascal Frossard. 2022. DiGress: Discrete Denoising diffusion for graph generation. InThe Eleventh International Conference on Learning Representations
2022
-
[47]
Joojo Walker, Ting Zhong, Fengli Zhang, Qiang Gao, and Fan Zhou. 2022. Recommendation via collaborative diffusion generative model. InInternational Conference on Knowledge Science, Engineering and Management. Springer, 593–605
2022
-
[48]
Chenyang Wang, Weizhi Ma, Chong Chen, Min Zhang, Yiqun Liu, and Shaoping Ma. 2023. Sequential Recommendation with Multiple Contrast Signals.ACM Trans. Inf. Syst.41, 1, Article 11 (Jan. 2023), 27 pages. doi:10.1145/3522673
-
[49]
Chao Wang, Hengshu Zhu, Chen Zhu, Chuan Qin, Enhong Chen, and Hui Xiong. 2023. SetRank: A Setwise Bayesian Approach for Collaborative Ranking in Recommender System.ACM Trans. Inf. Syst.42, 2, Article 56 (Nov. 2023), 32 pages. doi:10.1145/3626194
-
[50]
Jun Wang, Lantao Yu, Weinan Zhang, Yu Gong, Yinghui Xu, Benyou Wang, Peng Zhang, and Dell Zhang. 2017. Irgan: A minimax game for unifying generative and discriminative information retrieval models. InProceedings of the 40th International ACM SIGIR conference on Research and Development in Information Retrieval. 515–524
2017
-
[51]
Wenjie Wang, Yiyan Xu, Fuli Feng, Xinyu Lin, Xiangnan He, and Tat-Seng Chua. 2023. Diffusion Recommender Model. arXiv preprint arXiv:2304.04971(2023). ACM Trans. Inf. Syst., Vol. 1, No. 1, Article . Publication date: May 2026. Interests Burn-down Diffusion Process for Personalized Collaborative Filtering 25
-
[52]
Yifan Wang, Yifang Qin, Yu Han, Mingyang Yin, Jingren Zhou, Hongxia Yang, and Ming Zhang. 2022. Ad-aug: Adversarial data augmentation for counterfactual recommendation. InJoint European Conference on Machine Learning and Knowledge Discovery in Databases. Springer, 474–490
2022
-
[53]
Zhidan Wang, Wenwen Ye, Xu Chen, Wenqiang Zhang, Zhenlei Wang, Lixin Zou, and Weidong Liu. 2022. Generative session-based recommendation. InProceedings of the ACM Web Conference 2022. 2227–2235
2022
-
[54]
Johannes G Wijmans and Richard W Baker. 1995. The solution-diffusion model: a review.Journal of membrane science 107, 1-2 (1995), 1–21
1995
-
[55]
Yao Wu, Christopher DuBois, Alice X Zheng, and Martin Ester. 2016. Collaborative denoising auto-encoders for top-n recommender systems. InProceedings of the ninth ACM international conference on web search and data mining. 153–162
2016
-
[56]
Lianghao Xia, Chao Huang, Yong Xu, Huance Xu, Xiang Li, and Weiguo Zhang. 2021. Collaborative reflection- augmented autoencoder network for recommender systems.ACM Transactions on Information Systems (TOIS)40, 1 (2021), 1–22
2021
-
[57]
Minkai Xu, Lantao Yu, Yang Song, Chence Shi, Stefano Ermon, and Jian Tang. 2021. GeoDiff: A Geometric Diffusion Model for Molecular Conformation Generation. InInternational Conference on Learning Representations
2021
- [58]
-
[59]
Zichao Yang, Zhiting Hu, Ruslan Salakhutdinov, and Taylor Berg-Kirkpatrick. 2017. Improved variational autoencoders for text modeling using dilated convolutions. InInternational conference on machine learning. PMLR, 3881–3890
2017
-
[60]
Shujian Yu and Jose C Principe. 2019. Understanding autoencoders with information theoretic concepts.Neural Networks117 (2019), 104–123
2019
-
[61]
Meng Yuan, Yutian Xiao, Wei Chen, Chou Zhao, Deqing Wang, and Fuzhen Zhuang. 2025. Hyperbolic Diffusion Recommender Model. InProceedings of the ACM on Web Conference 2025(Sydney NSW, Australia)(WWW ’25). Association for Computing Machinery, New York, NY, USA, 1992–2006. doi:10.1145/3696410.3714873
-
[62]
Shuo Zhang, Xiangwu Meng, and Yujie Zhang. 2024. Variational Type Graph Autoencoder for Denoising on Event Recommendation.ACM Trans. Inf. Syst.43, 1, Article 26 (Dec. 2024), 27 pages. doi:10.1145/3703156
-
[63]
Yi Zhang, Yiwen Zhang, Lei Sang, and Victor S. Sheng. 2024. Simplify to the Limit! Embedding-Less Graph Collaborative Filtering for Recommender Systems.ACM Trans. Inf. Syst.43, 1, Article 22 (Dec. 2024), 30 pages. doi:10.1145/3701230
-
[64]
Jujia Zhao, Wang Wenjie, Yiyan Xu, Teng Sun, Fuli Feng, and Tat-Seng Chua. 2024. Denoising diffusion recommender model. InProceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval. 1370–1379
2024
-
[65]
Yuyue Zhao, Xiang Wang, Jiawei Chen, Yashen Wang, Wei Tang, Xiangnan He, and Haiyong Xie. 2022. Time-aware Path Reasoning on Knowledge Graph for Recommendation.ACM Trans. Inf. Syst.41, 2, Article 26 (Dec. 2022), 26 pages. doi:10.1145/3531267 Received 22 February 2025; revised 28 May 2025; revised 17 December 2025; accepted 6 May 2026 ACM Trans. Inf. Syst....
discussion (0)
Sign in with ORCID, Apple, or X to comment. Anyone can read and Pith papers without signing in.