Topic: MLRL
Publications for topic "MLRL" sorted by first author
B
Extreme Gradient Boosted Multi-label Trees for Dynamic Classifier Chains, Knowledge Engineering Group, Technische Universität Darmstadt, ArXiv e-prints, arXiv:2006.08094 [cs.LG], 2020
Extreme Gradient Boosted Multi-label Trees for Dynamic Classifier Chains, in: Discovery Science - 23rd International Conference, DS 2020, Thessaloniki, Greece, October 19-21, 2020, Proceedings, pages 471--485, Springer International Publishing, 2020
Combining Predictions under Uncertainty: The Case of Random Decision Trees, in: Discovery Science, pages 78--93, Springer, 2021
D
GPU-accelerated Learning of Gradient Boosted Multi-label Classification Rules, TU Darmstadt, 2020
E
Scalable Histogram-based Induction of Gradient Boosted Multi-label Rules, TU Darmstadt, 2021
F
Learning Structured Declarative Rule Sets — A Challenge for Deep Discrete Learning, in: 2nd Workshop on Deep Continuous-Discrete Machine Learning (DeCoDeML), 2020
H
Optimizing Rule Selection in Random Forest Simplification, TU Darmstadt, 2020
Rule-Based Multi-label Classification: Challenges and Opportunities, in: Rules and Reasoning, pages 3--19, Springer International Publishing, 2020
Conformal Rule-Based Multi-label Classification, in: KI 2020: Advances in Artificial Intelligence, Springer, Cham, 2020
A Flexible Class of Dependence-sensitive Multi-label Loss Functions (2021), in: Machine Learning Journal
K
Boosted Rule Learning for Multi-Label Classification using Stochastic Gradient Descent, TU Darmstadt, 2021
Efficient Discovery of Expressive Multi-label Rules using Relaxed Pruning, in: Discovery Science, pages 367--382, Springer International Publishing, 2019
L
Learning Interpretable Rules for Multi-label Classification, in: Explainable and Interpretable Models in Computer Vision and Machine Learning, pages 81--113, Springer-Verlag, 2018
Learning rules for multi-label classification: a stacking and a separate-and-conquer approach (2016), in: Machine Learning, 105:1(77--126)
Tree-Based Dynamic Classifier Chains (2021), in: Machine Learning Journal
N
Learning Context-dependent Label Permutations for Multi-label Classification, in: Proceedings of the 36th International Conference on Machine Learning (ICML-19), pages 4733--4742, PMLR, 2019
Reliable Multilabel Classification: Prediction with Partial Abstention (2020), in: Proceedings of the AAAI Conference on Artificial Intelligence, 34:04(5264--5271)
On Aggregation in Ensembles of Multilabel Classifiers, in: Discovery Science, pages 533--547, Springer International Publishing, 2020
R
Simplifying Random Forests: On the Trade-off between Interpretability and Accuracy, Knowledge Engineering Group, Technische Universität Darmstadt, ArXiv e-prints, arXiv:1911.04393, 2019
Gradient-Based Label Binning in Multi-Label Classification, in: Machine Learning and Knowledge Discovery in Databases (ECML-PKDD), Springer, 2021
On the Trade-off Between Consistency and Coverage in Multi-label Rule Learning Heuristics, in: Discovery Science, pages 96--111, Springer International Publishing, 2019
Learning Gradient Boosted Multi-label Classification Rules, in: Machine Learning and Knowledge Discovery in Databases (ECML-PKDD), pages 124--140, Springer, 2020
Exploiting Anti-monotonicity of Multi-label Evaluation Measures for Inducing Multi-label Rules, in: PAKDD 2018: Advances in Knowledge Discovery and Data Mining, pages 29--42, Springer International Publishing, 2018
W
Multi-target prediction: a unifying view on problems and methods (2019), in: Data Mining and Knowledge Discovery, 33:2(293--324)