TopoFisher: Learning Topological Summary Statistics by Maximizing Fisher Information
TopoFisher optimizes trainable filtrations, vectorizations, and compressors in persistent homology to maximize Fisher information, yielding higher Fisher information than fixed cosmological summaries and approaching neural baselines with far fewer parameters while generalizing better under simulator shifts.
9 Pith papers cite this work.
citing papers explorer
-
TopoFisher: Learning Topological Summary Statistics by Maximizing Fisher Information
TopoFisher optimizes trainable filtrations, vectorizations, and compressors in persistent homology to maximize Fisher information, yielding higher Fisher information than fixed cosmological summaries and approaching neural baselines with far fewer parameters while generalizing better under simulator shifts.
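The Fisher-information objective in the summary can be sketched on toy data: for summaries with a Gaussian likelihood, the Fisher matrix is built from parameter derivatives of the mean summary and the fiducial covariance, and its log-determinant is the quantity a trainable pipeline would maximize. Everything below (array shapes, the fake parameter dependence via mean shifts) is a hypothetical stand-in, not the paper's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for compressed topological summaries: n_sims realizations
# at the fiducial parameters and at each parameter perturbed by +/- dtheta.
n_sims, n_dim, n_params, dtheta = 500, 4, 2, 0.05
fid = rng.normal(size=(n_sims, n_dim))
# Fake parameter dependence: shift the mean along fixed random directions.
shifts = rng.normal(size=(n_params, n_dim))
plus = np.stack([fid + dtheta * s for s in shifts])
minus = np.stack([fid - dtheta * s for s in shifts])

def fisher_matrix(fid, plus, minus, dtheta):
    """Gaussian Fisher matrix F_ij = dmu_i^T C^{-1} dmu_j from simulations."""
    cov = np.cov(fid, rowvar=False)
    # Central finite differences of the mean summary w.r.t. each parameter.
    dmu = (plus.mean(axis=1) - minus.mean(axis=1)) / (2 * dtheta)
    return dmu @ np.linalg.solve(cov, dmu.T)

F = fisher_matrix(fid, plus, minus, dtheta)
score = np.linalg.slogdet(F)[1]  # log det F: the scalar one would maximize
```

Maximizing `score` over the parameters of the filtration, vectorization, and compressor (with gradients flowing through the summary computation) is the training signal the abstract describes.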
-
LLM-Driven Performance-Space Augmentation for Meta-Learning-Based Algorithm Selection
LLM-generated synthetic datasets steered uniformly across a 2D performance space defined by two landmark algorithms improve meta-learner performance on algorithm selection for regression tasks.
-
Eliciting Latent Predictions from Transformers with the Tuned Lens
Training per-layer affine probes on frozen transformers yields more reliable latent predictions than the logit lens and enables detection of malicious inputs from prediction trajectories.
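The per-layer affine probe can be illustrated on synthetic hidden states: the tuned lens fits an affine map from an intermediate layer's residual stream toward the final representation, whereas the logit lens decodes the intermediate state directly (an identity map). The paper trains the probe with a distillation loss; the closed-form least-squares fit and random data below are a simplified stand-in.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in: hidden states at an intermediate layer and at the final
# layer of a frozen transformer (random data with an affine relationship).
n_tokens, d_model = 256, 32
h_mid = rng.normal(size=(n_tokens, d_model))
W_true = rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)
b_true = rng.normal(size=d_model)
h_final = h_mid @ W_true + b_true + 0.01 * rng.normal(size=(n_tokens, d_model))

# Tuned-lens-style probe: affine map h_mid -> h_final, fit by least squares.
X = np.hstack([h_mid, np.ones((n_tokens, 1))])
coef, *_ = np.linalg.lstsq(X, h_final, rcond=None)
W_hat, b_hat = coef[:-1], coef[-1]

pred = h_mid @ W_hat + b_hat
residual = np.mean((pred - h_final) ** 2)
# Logit-lens baseline: decode the intermediate state as-is (identity map).
baseline = np.mean((h_mid - h_final) ** 2)
```

The gap between `residual` and `baseline` mirrors the summary's claim: a learned per-layer affine correction recovers the model's latent prediction far better than reading the intermediate state directly.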
-
BoolXLLM: LLM-Assisted Explainability for Boolean Models
BoolXLLM augments an existing Boolean rule learner with LLMs for feature selection, discretization thresholds, and natural-language rule translation to improve interpretability while preserving accuracy.
-
Dynamic Meta-Metrics: Source-Sentence Conditioned Weighting for MT Evaluation
Dynamic Meta-Metrics learns source-sentence-conditioned combinations of MT metrics, with MLP-based hard and soft clustering versions outperforming static linear and Gaussian process ensembles on WMT data.
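The soft-clustering variant can be sketched as a gating computation: a representation of the source sentence determines a convex combination of the individual metric scores. The single linear layer, embedding size, and metric scores below are hypothetical placeholders for the paper's MLP and WMT metrics.

```python
import numpy as np

rng = np.random.default_rng(3)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical inputs: an embedding of the source sentence and the scores
# that k individual MT metrics assign to one candidate translation.
d_embed, n_metrics = 16, 4
src_embed = rng.normal(size=d_embed)
metric_scores = np.array([0.71, 0.64, 0.80, 0.55])

# Soft weighting (single linear layer standing in for the MLP): the source
# embedding yields a convex combination over the component metrics.
W = rng.normal(size=(n_metrics, d_embed)) / np.sqrt(d_embed)
weights = softmax(W @ src_embed)
combined = float(weights @ metric_scores)
```

Because the weights are source-conditioned, two source sentences can combine the same component metrics differently, which is what distinguishes this from a static linear ensemble.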
-
Steering Llama 2 via Contrastive Activation Addition
Contrastive Activation Addition steers Llama 2 Chat by adding averaged residual-stream activation differences from contrastive example pairs to control targeted behaviors at inference time.
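The core computation is compact enough to sketch: average the residual-stream activation differences over contrastive prompt pairs, then add the resulting vector (scaled) to activations at inference. The random activations and the planted behavior direction below are toy stand-ins; in the paper the activations come from Llama 2 Chat at a chosen layer.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy residual-stream activations for contrastive prompt pairs: one set
# exhibiting the target behavior, one not (a planted direction + noise).
d_model, n_pairs = 64, 100
direction = rng.normal(size=d_model)
pos = rng.normal(size=(n_pairs, d_model)) + direction
neg = rng.normal(size=(n_pairs, d_model)) - direction

# Contrastive Activation Addition: steering vector = mean activation
# difference over the contrastive pairs.
steering = (pos - neg).mean(axis=0)

def steer(resid, vec, alpha=1.0):
    """Add the scaled steering vector to a residual-stream activation."""
    return resid + alpha * vec

h = rng.normal(size=d_model)
h_steered = steer(h, steering, alpha=2.0)
```

Averaging over many pairs cancels pair-specific noise, so the steering vector aligns with the shared behavior direction; the sign and magnitude of `alpha` then control the behavior at inference time.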
-
A Unified Approach for Computing Wasserstein Barycenters of Discrete and Continuous Measures
A mirror descent algorithm computes exact Wasserstein barycenters for mixed discrete and continuous input measures with convergence guarantees.
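The mirror descent machinery involved can be illustrated on a generic convex problem over the probability simplex: with the entropy mirror map, the update is multiplicative followed by re-normalization. This is only the core update such an algorithm builds on, applied to a toy objective, not the Wasserstein barycenter computation itself.

```python
import numpy as np

# Entropic mirror descent over the probability simplex: multiplicative
# update in the gradient, then re-normalize back onto the simplex.
def mirror_descent(grad, x0, steps=500, eta=0.1):
    x = x0.copy()
    for _ in range(steps):
        x = x * np.exp(-eta * grad(x))  # entropic (multiplicative) step
        x /= x.sum()                    # projection via normalization
    return x

# Toy objective: squared distance to a target point c inside the simplex,
# whose unique minimizer over the simplex is c itself.
c = np.array([0.5, 0.3, 0.2])
grad = lambda x: x - c
x0 = np.ones(3) / 3
x_star = mirror_descent(grad, x0)
```

The entropic mirror map keeps iterates strictly positive and normalized without an explicit Euclidean projection, which is why mirror descent is a natural fit for optimization over probability measures.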
-
Observational Signatures and Constraints on the Intermediate Neutron-Capture Process. The Case of the CEMP star TYC 6044-714-1 (RAVE J094921.8-161722)
Abundances and Ba isotopic ratios in TYC 6044-714-1 are best reproduced by s+r nucleosynthesis models; i+s+r models require extreme conditions and fail to match the full pattern.
-
NeuralSet: A High-Performing Python Package for Neuro-AI
NeuralSet is a scalable Python framework that unifies diverse neural recordings and stimuli with deep learning embeddings via metadata decoupling and lazy data extraction.