Andreas Kirsch
Unknown affiliation
Verified email at google.com - Homepage
Title · Cited by · Year
Batchbald: Efficient and diverse batch acquisition for deep bayesian active learning
A Kirsch, J van Amersfoort, Y Gal
Advances in Neural Information Processing Systems (NeurIPS), 7024-7035, 2019
683 · 2019
Deep Deterministic Uncertainty: A New Simple Baseline
J Mukhoti, A Kirsch, J van Amersfoort, PHS Torr, Y Gal
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2023
234* · 2023
Prioritized training on points that are learnable, worth learning, and not yet learnt
S Mindermann, JM Brauner, MT Razzak, M Sharma, A Kirsch, W Xu, ...
International Conference on Machine Learning (ICML), 15630-15649, 2022
127 · 2022
Plex: Towards reliability using pretrained large model extensions
D Tran, J Liu, MW Dusenberry, D Phan, M Collier, J Ren, K Han, Z Wang, ...
arXiv preprint arXiv:2207.07411, 2022
125* · 2022
Prediction-Oriented Bayesian Active Learning
F Bickford Smith, A Kirsch, S Farquhar, Y Gal, A Foster, T Rainforth
International Conference on Artificial Intelligence and Statistics (AISTATS …, 2023
52* · 2023
Stochastic Batch Acquisition: A Simple Baseline for Deep Active Learning
A Kirsch, S Farquhar, P Atighehchian, A Jesson, F Branchaud-Charron, ...
Transactions on Machine Learning Research (TMLR), 2023
45* · 2023
Causal-bald: Deep bayesian active learning of outcomes to infer treatment-effects from observational data
A Jesson, P Tigas, J van Amersfoort, A Kirsch, U Shalit, Y Gal
Advances in Neural Information Processing Systems (NeurIPS) 34, 30465-30478, 2021
35 · 2021
Unpacking information bottlenecks: Unifying information-theoretic objectives in deep learning
A Kirsch, C Lyle, Y Gal
Workshop Uncertainty & Robustness in Deep Learning at Int. Conf. on Machine …, 2020
22* · 2020
A Note on "Assessing Generalization of SGD via Disagreement"
A Kirsch, Y Gal
Transactions on Machine Learning Research (TMLR), 2022
20 · 2022
Unifying Approaches in Active Learning and Active Sampling via Fisher Information and Information-Theoretic Quantities
A Kirsch, Y Gal
Transactions on Machine Learning Research (TMLR), 2022
19* · 2022
Black-Box Batch Active Learning for Regression
A Kirsch
Transactions on Machine Learning Research, 2023
8 · 2023
Does Deep Learning on a Data Diet reproduce? Overall yes, but GraNd at Initialization does not
A Kirsch
Transactions on Machine Learning Research, 2023
7 · 2023
Advancing Deep Active Learning & Data Subset Selection: Unifying Principles with Information-Theory Intuitions
A Kirsch
arXiv preprint arXiv:2401.04305, 2024
6 · 2024
A Practical & Unified Notation for Information-Theoretic Quantities in ML
A Kirsch, Y Gal
arXiv preprint arXiv:2106.12062, 2021
4 · 2021
CoLoR-Filter: Conditional Loss Reduction Filtering for Targeted Language Model Pre-training
D Brandfonbrener, H Zhang, A Kirsch, JR Schwarz, S Kakade
arXiv preprint arXiv:2406.10670, 2024
3 · 2024
MDP environments for the OpenAI Gym
A Kirsch
arXiv preprint arXiv:1709.09069, 2017
2 · 2017
Speeding Up BatchBALD: A k-BALD Family of Approximations for Active Learning
A Kirsch
arXiv preprint arXiv:2301.09490, 2023
1 · 2023
Marginal and joint cross-entropies & predictives for online Bayesian inference, active learning, and active sampling
A Kirsch, J Kossen, Y Gal
arXiv preprint arXiv:2205.08766, 2022
1 · 2022
Effiziente Implementierung von mehrzentrigen molekulardynamischen Potentialmodellen mit Cuda [Efficient implementation of multi-center molecular dynamics potential models with CUDA]
A Kirsch
Studienarbeit/SEP/IDP (student research project), Institut für Informatik, Technische Universität München, 2011
1 · 2011
All models are wrong, some are useful: Model Selection with Limited Labels
P Okanovic, A Kirsch, J Kasper, T Hoefler, A Krause, NM Gürel
arXiv preprint arXiv:2410.13609, 2024
2024