| | 2009 |
|---|---|
| 140 | Manfred K. Warmuth, S. V. N. Vishwanathan: Tutorial summary: Survey of boosting from an optimization perspective. ICML 2009: 175 |
| | **2008** |
| 139 | Manfred K. Warmuth, Karen A. Glocer, S. V. N. Vishwanathan: Entropy Regularized LPBoost. ALT 2008: 256-271 |
| 138 | Jacob Abernethy, Manfred K. Warmuth, Joel Yellin: When Random Play is Optimal Against an Adversary. COLT 2008: 437-446 |
| 137 | Adam M. Smith, Manfred K. Warmuth: Learning Rotations. COLT 2008: 517 |
| | **2007** |
| 136 | David P. Helmbold, Manfred K. Warmuth: Learning Permutations with Exponential Weights. COLT 2007: 469-483 |
| 135 | Manfred K. Warmuth: When Is There a Free Matrix Lunch? COLT 2007: 630-632 |
| 134 | Dima Kuzmin, Manfred K. Warmuth: Online kernel PCA with entropic matrix updates. ICML 2007: 465-472 |
| 133 | Manfred K. Warmuth: Winnowing subspaces. ICML 2007: 999-1006 |
| 132 | Manfred K. Warmuth, Karen A. Glocer, Gunnar Rätsch: Boosting Algorithms for Maximizing the Soft Margin. NIPS 2007 |
| | **2006** |
| 131 | Manfred K. Warmuth, Dima Kuzmin: Online Variance Minimization. COLT 2006: 514-528 |
| 130 | Jacob Abernethy, John Langford, Manfred K. Warmuth: Continuous Experts and the Binning Algorithm. COLT 2006: 544-558 |
| 129 | Manfred K. Warmuth: Can Entropic Regularization Be Replaced by Squared Euclidean Distance Plus Additional Linear Constraints. COLT 2006: 653-654 |
| 128 | Manfred K. Warmuth, Jun Liao, Gunnar Rätsch: Totally corrective boosting algorithms that maximize the margin. ICML 2006: 1001-1008 |
| 127 | Manfred K. Warmuth, Dima Kuzmin: Randomized PCA Algorithms with Regret Bounds that are Logarithmic in the Dimension. NIPS 2006: 1481-1488 |
| 126 | Manfred K. Warmuth, Dima Kuzmin: A Bayesian Probability Calculus for Density Matrices. UAI 2006 |
| 125 | Manfred K. Warmuth: A Bayesian Probability Calculus for Density Matrices. UAI 2006 |
| 124 | Jyrki Kivinen, Manfred K. Warmuth, Babak Hassibi: The p-norm generalization of the LMS algorithm for adaptive filtering. IEEE Transactions on Signal Processing 54(5): 1782-1793 (2006) |
| | **2005** |
| 123 | Manfred K. Warmuth, S. V. N. Vishwanathan: Leaving the Span. COLT 2005: 366-381 |
| 122 | Dima Kuzmin, Manfred K. Warmuth: Unlabeled Compression Schemes for Maximum Classes. COLT 2005: 591-605 |
| 121 | Dima Kuzmin, Manfred K. Warmuth: Optimum Follow the Leader Algorithm. COLT 2005: 684-686 |
| 120 | Manfred K. Warmuth: A Bayes Rule for Density Matrices. NIPS 2005 |
| 119 | Gunnar Rätsch, Manfred K. Warmuth: Efficient Margin Maximizing with Boosting. Journal of Machine Learning Research 6: 2131-2152 (2005) |
| 118 | Koji Tsuda, Gunnar Rätsch, Manfred K. Warmuth: Matrix Exponentiated Gradient Updates for On-line Learning and Bregman Projection. Journal of Machine Learning Research 6: 995-1018 (2005) |
| | **2004** |
| 117 | Manfred K. Warmuth: The Optimal PAC Algorithm. COLT 2004: 641-642 |
| 116 | Koji Tsuda, Gunnar Rätsch, Manfred K. Warmuth: Matrix Exponential Gradient Updates for On-line Learning and Bregman Projection. NIPS 2004 |
| | **2003** |
| 115 | Bernhard Schölkopf, Manfred K. Warmuth: Computational Learning Theory and Kernel Machines, 16th Annual Conference on Computational Learning Theory and 7th Kernel Workshop, COLT/Kernel 2003, Washington, DC, USA, August 24-27, 2003, Proceedings. Springer 2003 |
| 114 | Manfred K. Warmuth: Compressing to VC Dimension Many Points. COLT 2003: 743-744 |
| 113 | Kohei Hatano, Manfred K. Warmuth: Boosting versus Covering. NIPS 2003 |
| 112 | Manfred K. Warmuth, Jun Liao, Gunnar Rätsch, Michael Mathieson, Santosh Putta, Christian Lemmen: Active Learning with Support Vector Machines in the Drug Discovery Process. Journal of Chemical Information and Computer Sciences 43(2): 667-673 (2003) |
| 111 | Eiji Takimoto, Manfred K. Warmuth: Path Kernels and Multiplicative Updates. Journal of Machine Learning Research 4: 773-818 (2003) |
| 110 | Jürgen Forster, Manfred K. Warmuth: Relative Loss Bounds for Temporal-Difference Learning. Machine Learning 51(1): 23-50 (2003) |
| | **2002** |
| 109 | Gunnar Rätsch, Manfred K. Warmuth: Maximizing the Margin with Boosting. COLT 2002: 334-350 |
| 108 | Eiji Takimoto, Manfred K. Warmuth: Path Kernels and Multiplicative Updates. COLT 2002: 74-89 |
| 107 | Robert B. Gramacy, Manfred K. Warmuth, Scott A. Brandt, Ismail Ari: Adaptive Caching by Refetching. NIPS 2002: 1465-1472 |
| 106 | Jürgen Forster, Manfred K. Warmuth: Relative Expected Instantaneous Loss Bounds. J. Comput. Syst. Sci. 64(1): 76-102 (2002) |
| 105 | Olivier Bousquet, Manfred K. Warmuth: Tracking a Small Set of Experts by Mixing Past Posteriors. Journal of Machine Learning Research 3: 363-396 (2002) |
| 104 | David P. Helmbold, Sandra Panizza, Manfred K. Warmuth: Direct and indirect algorithms for on-line learning of disjunctions. Theor. Comput. Sci. 284(1): 109-142 (2002) |
| 103 | Eiji Takimoto, Manfred K. Warmuth: Predicting nearly as well as the best pruning of a planar decision graph. Theor. Comput. Sci. 288(2): 217-235 (2002) |
| | **2001** |
| 102 | Olivier Bousquet, Manfred K. Warmuth: Tracking a Small Set of Experts by Mixing Past Posteriors. COLT/EuroCOLT 2001: 31-47 |
| 101 | Manfred K. Warmuth, Gunnar Rätsch, Michael Mathieson, Jun Liao, Christian Lemmen: Active Learning in the Drug Discovery Process. NIPS 2001: 1449-1456 |
| 100 | Gunnar Rätsch, Sebastian Mika, Manfred K. Warmuth: On the Convergence of Leveraging. NIPS 2001: 487-494 |
| 99 | Mark Herbster, Manfred K. Warmuth: Tracking the Best Linear Predictor. Journal of Machine Learning Research 1: 281-309 (2001) |
| 98 | Katy S. Azoury, Manfred K. Warmuth: Relative Loss Bounds for On-Line Density Estimation with the Exponential Family of Distributions. Machine Learning 43(3): 211-246 (2001) |
| 97 | Jyrki Kivinen, Manfred K. Warmuth: Relative Loss Bounds for Multidimensional Regression Problems. Machine Learning 45(3): 301-329 (2001) |
| | **2000** |
| 96 | Eiji Takimoto, Manfred K. Warmuth: The Last-Step Minimax Algorithm. ALT 2000: 279-290 |
| 95 | Eiji Takimoto, Manfred K. Warmuth: The Minimax Strategy for Gaussian Density Estimation. COLT 2000: 100-106 |
| 94 | Gunnar Rätsch, Manfred K. Warmuth, Sebastian Mika, Takashi Onoda, Steven Lemm, Klaus-Robert Müller: Barrier Boosting. COLT 2000: 170-179 |
| 93 | Jürgen Forster, Manfred K. Warmuth: Relative Expected Instantaneous Loss Bounds. COLT 2000: 90-99 |
| 92 | Jürgen Forster, Manfred K. Warmuth: Relative Loss Bounds for Temporal-Difference Learning. ICML 2000: 295-302 |
| 91 | Peter Auer, Stephen Kwek, Wolfgang Maass, Manfred K. Warmuth: Learning of Depth Two Neural Networks with Constant Fan-in at the Hidden Nodes. Electronic Colloquium on Computational Complexity (ECCC) 7(55) (2000) |
| 90 | Peter Auer, Manfred K. Warmuth: Tracking the best disjunction. Electronic Colloquium on Computational Complexity (ECCC) 7(70) (2000) |
| | **1999** |
| 89 | Eiji Takimoto, Manfred K. Warmuth: Predicting Nearly as well as the best Pruning of a Planar Decision Graph. ALT 1999: 335-346 |
| 88 | Jyrki Kivinen, Manfred K. Warmuth: Boosting as Entropy Projection. COLT 1999: 134-144 |
| 87 | David P. Helmbold, Sandra Panizza, Manfred K. Warmuth: Direct and Indirect Algorithms for On-line Learning of Disjunctions. EuroCOLT 1999: 138-152 |
| 86 | Jyrki Kivinen, Manfred K. Warmuth: Averaging Expert Predictions. EuroCOLT 1999: 153-167 |
| 85 | Katy S. Azoury, Manfred K. Warmuth: Relative Loss Bounds for On-line Density Estimation with the Exponential Family of Distributions. UAI 1999: 31-40 |
| 84 | David P. Helmbold, Jyrki Kivinen, Manfred K. Warmuth: Relative loss bounds for single neurons. IEEE Transactions on Neural Networks 10(6): 1291-1304 (1999) |
| | **1998** |
| 83 | Mark Herbster, Manfred K. Warmuth: Tracking the Best Regressor. COLT 1998: 24-31 |
| 82 | Claudio Gentile, Manfred K. Warmuth: Linear Hinge Loss and Average Margin. NIPS 1998: 225-231 |
| 81 | Yoram Singer, Manfred K. Warmuth: Batch and On-Line Parameter Estimation of Gaussian Mixtures Based on the Joint Entropy. NIPS 1998: 578-584 |
| 80 | David Haussler, Jyrki Kivinen, Manfred K. Warmuth: Sequential Prediction of Individual Sequences Under General Loss Functions. IEEE Transactions on Information Theory 44(5): 1906-1925 (1998) |
| 79 | Wolfgang Maass, Manfred K. Warmuth: Efficient Learning With Virtual Threshold Gates. Inf. Comput. 141(1): 66-83 (1998) |
| 78 | Peter Auer, Manfred K. Warmuth: Tracking the Best Disjunction. Machine Learning 32(2): 127-150 (1998) |
| 77 | Mark Herbster, Manfred K. Warmuth: Tracking the Best Expert. Machine Learning 32(2): 151-178 (1998) |
| | **1997** |
| 76 | Manfred K. Warmuth: Sample Compression, Learnability, and the Vapnik-Chervonenkis Dimension. EuroCOLT 1997: 1-2 |
| 75 | Jyrki Kivinen, Manfred K. Warmuth: Relative Loss Bounds for Multidimensional Regression Problems. NIPS 1997 |
| 74 | Manfred K. Warmuth: Relative Loss Bounds, the Minimum Relative Entropy Principle, and EM. NIPS 1997 |
| 73 | Yoav Freund, Robert E. Schapire, Yoram Singer, Manfred K. Warmuth: Using and Combining Predictors That Specialize. STOC 1997: 334-343 |
| 72 | Jyrki Kivinen, Manfred K. Warmuth, Peter Auer: The Perceptron Algorithm Versus Winnow: Linear Versus Logarithmic Mistake Bounds when Few Input Variables are Relevant (Technical Note). Artif. Intell. 97(1-2): 325-343 (1997) |
| 71 | Jyrki Kivinen, Manfred K. Warmuth: Exponentiated Gradient Versus Gradient Descent for Linear Predictors. Inf. Comput. 132(1): 1-63 (1997) |
| 70 | Nicolò Cesa-Bianchi, Yoav Freund, David Haussler, David P. Helmbold, Robert E. Schapire, Manfred K. Warmuth: How to use expert advice. J. ACM 44(3): 427-485 (1997) |
| 69 | David P. Helmbold, Robert E. Schapire, Yoram Singer, Manfred K. Warmuth: A Comparison of New and Old Algorithms for a Mixture Estimation Problem. Machine Learning 27(1): 97-119 (1997) |
| | **1996** |
| 68 | Peter Auer, Stephen Kwek, Wolfgang Maass, Manfred K. Warmuth: Learning of Depth Two Neural Networks with Constant Fan-In at the Hidden Nodes (Extended Abstract). COLT 1996: 333-343 |
| 67 | David P. Helmbold, Robert E. Schapire, Yoram Singer, Manfred K. Warmuth: On-Line Portfolio Selection Using Multiplicative Updates. ICML 1996: 243-251 |
| 66 | Yoram Singer, Manfred K. Warmuth: Training Algorithms for Hidden Markov Models using Entropy Based Distance Functions. NIPS 1996: 641-647 |
| 65 | Robert E. Schapire, Manfred K. Warmuth: On the Worst-Case Analysis of Temporal-Difference Learning Algorithms. Machine Learning 22(1-3): 95-121 (1996) |
| 64 | Nicolò Cesa-Bianchi, Yoav Freund, David P. Helmbold, Manfred K. Warmuth: On-line Prediction and Conversion Strategies. Machine Learning 25(1): 71-110 (1996) |
| | **1995** |
| 63 | Jyrki Kivinen, Manfred K. Warmuth: The Perceptron Algorithm vs. Winnow: Linear vs. Logarithmic Mistake Bounds when few Input Variables are Relevant. COLT 1995: 289-296 |
| 62 | David P. Helmbold, Yoram Singer, Robert E. Schapire, Manfred K. Warmuth: A Comparison of New and Old Algorithms for a Mixture Estimation Problem. COLT 1995: 69-78 |
| 61 | David Haussler, Jyrki Kivinen, Manfred K. Warmuth: Tight worst-case loss bounds for predicting with expert advice. EuroCOLT 1995: 69-83 |
| 60 | Peter Auer, Manfred K. Warmuth: Tracking the Best Disjunction. FOCS 1995: 312-321 |
| 59 | Mark Herbster, Manfred K. Warmuth: Tracking the Best Expert. ICML 1995: 286-294 |
| 58 | Wolfgang Maass, Manfred K. Warmuth: Efficient Learning with Virtual Threshold Gates. ICML 1995: 378-386 |
| 57 | David P. Helmbold, Jyrki Kivinen, Manfred K. Warmuth: Worst-case Loss Bounds for Single Neurons. NIPS 1995: 309-315 |
| 56 | Peter Auer, Mark Herbster, Manfred K. Warmuth: Exponentially many local minima for single neurons. NIPS 1995: 316-322 |
| 55 | Jyrki Kivinen, Manfred K. Warmuth: Additive versus exponentiated gradient updates for linear prediction. STOC 1995: 209-218 |
| 54 | Nick Littlestone, Philip M. Long, Manfred K. Warmuth: On-line Learning of Linear Functions. Computational Complexity 5(1): 1-23 (1995) |
| 53 | David P. Helmbold, Manfred K. Warmuth: On Weak Learning. J. Comput. Syst. Sci. 50(3): 551-573 (1995) |
| 52 | Sally A. Goldman, Manfred K. Warmuth: Learning Binary Relations Using Weighted Majority Voting. Machine Learning 20(3): 245-271 (1995) |
| 51 | Sally Floyd, Manfred K. Warmuth: Sample Compression, Learnability, and the Vapnik-Chervonenkis Dimension. Machine Learning 21(3): 269-304 (1995) |
| | **1994** |
| 50 | Robert E. Schapire, Manfred K. Warmuth: On the Worst-Case Analysis of Temporal-Difference Learning Algorithms. ICML 1994: 266-274 |
| 49 | Nicolò Cesa-Bianchi, Anders Krogh, Manfred K. Warmuth: Bounds on approximate steepest descent for likelihood maximization in exponential families. IEEE Transactions on Information Theory 40(4): 1215- (1994) |
| 48 | Hans L. Bodlaender, Shlomo Moran, Manfred K. Warmuth: The Distributed Bit Complexity of the Ring: From the Anonymous to the Non-anonymous Case. Inf. Comput. 108(1): 34-50 (1994) |
| 47 | Nick Littlestone, Manfred K. Warmuth: The Weighted Majority Algorithm. Inf. Comput. 108(2): 212-261 (1994) |
| 46 | Philip M. Long, Manfred K. Warmuth: Composite Geometric Concepts and Polynomial Predictability. Inf. Comput. 113(2): 230-252 (1994) |
| 45 | David Haussler, Nick Littlestone, Manfred K. Warmuth: Predicting {0,1}-Functions on Randomly Drawn Points. Inf. Comput. 115(2): 248-292 (1994) |
| | **1993** |
| 44 | Nicolò Cesa-Bianchi, Philip M. Long, Manfred K. Warmuth: Worst-Case Quadratic Loss Bounds for a Generalization of the Widrow-Hoff Rule. COLT 1993: 429-438 |
| 43 | Sally A. Goldman, Manfred K. Warmuth: Learning Binary Relations Using Weighted Majority Voting. COLT 1993: 453-462 |
| 42 | Nicolò Cesa-Bianchi, Yoav Freund, David P. Helmbold, David Haussler, Robert E. Schapire, Manfred K. Warmuth: How to use expert advice. STOC 1993: 382-391 |
| 41 | Leonard Pitt, Manfred K. Warmuth: The Minimum Consistent DFA Problem Cannot be Approximated within any Polynomial. J. ACM 40(1): 95-142 (1993) |
| 40 | Shlomo Moran, Manfred K. Warmuth: Gap Theorems for Distributed Computation. SIAM J. Comput. 22(2): 379-394 (1993) |
| | **1992** |
| 39 | David P. Helmbold, Manfred K. Warmuth: Some Weak Learning Results. COLT 1992: 399-412 |
| 38 | Naoki Abe, Manfred K. Warmuth: On the Computational Complexity of Approximating Distributions by Probabilistic Automata. Machine Learning 9: 205-260 (1992) |
| 37 | David P. Helmbold, Robert H. Sloan, Manfred K. Warmuth: Learning Integer Lattices. SIAM J. Comput. 21(2): 240-266 (1992) |
| | **1991** |
| 36 | Naoki Abe, Manfred K. Warmuth, Jun-ichi Takeuchi: Polynomial Learnability of Probabilistic Concepts with Respect to the Kullback-Leibler Divergence. COLT 1991: 277-289 |
| 35 | Nick Littlestone, Philip M. Long, Manfred K. Warmuth: On-Line Learning of Linear Functions. STOC 1991: 465-475 |
| 34 | David Haussler, Michael J. Kearns, Nick Littlestone, Manfred K. Warmuth: Equivalence of Models for Polynomial Learnability. Inf. Comput. 95(2): 129-161 (1991) |
| | **1990** |
| 33 | Philip M. Long, Manfred K. Warmuth: Composite Geometric Concepts and Polynomial Predictability. COLT 1990: 273-287 |
| 32 | David P. Helmbold, Robert H. Sloan, Manfred K. Warmuth: Learning Integer Lattices. COLT 1990: 288-302 |
| 31 | Naoki Abe, Manfred K. Warmuth: On the Computational Complexity of Approximating Distributions by Probabilistic Automata. COLT 1990: 52-66 |
| 30 | Leonard Pitt, Manfred K. Warmuth: Prediction-Preserving Reducibility. J. Comput. Syst. Sci. 41(3): 430-467 (1990) |
| 29 | Daniel Ratner, Manfred K. Warmuth: NxN Puzzle and Related Relocation Problem. J. Symb. Comput. 10(2): 111-138 (1990) |
| 28 | David P. Helmbold, Robert H. Sloan, Manfred K. Warmuth: Learning Nested Differences of Intersection-Closed Concept Classes. Machine Learning 5: 165-196 (1990) |
| | **1989** |
| 27 | Manfred K. Warmuth: Towards Representation Independence in PAC Learning. AII 1989: 78-103 |
| 26 | David P. Helmbold, Robert H. Sloan, Manfred K. Warmuth: Learning Nested Differences of Intersection-Closed Concept Classes. COLT 1989: 41-56 |
| 25 | Hans L. Bodlaender, Shlomo Moran, Manfred K. Warmuth: The Distributed Bit Complexity of the Ring: From the Anonymous to the Non-anonymous Case. FCT 1989: 58-67 |
| 24 | Nick Littlestone, Manfred K. Warmuth: The Weighted Majority Algorithm. FOCS 1989: 256-261 |
| 23 | Leonard Pitt, Manfred K. Warmuth: The Minimum Consistent DFA Problem Cannot Be Approximated within any Polynomial. STOC 1989: 421-432 |
| 22 | Leonard Pitt, Manfred K. Warmuth: The Minimum Consistent DFA Problem Cannot be Approximated within any Polynomial (abstract). Structure in Complexity Theory Conference 1989: 230 |
| 21 | Jakob Gonczarowski, Manfred K. Warmuth: Scattered Versus Context-Sensitive Rewriting. Acta Inf. 27(1): 81-95 (1989) |
| 20 | Richard J. Anderson, Ernst W. Mayr, Manfred K. Warmuth: Parallel Approximation Algorithms for Bin Packing. Inf. Comput. 82(3): 262-277 (1989) |
| 19 | Anselm Blumer, Andrzej Ehrenfeucht, David Haussler, Manfred K. Warmuth: Learnability and the Vapnik-Chervonenkis dimension. J. ACM 36(4): 929-965 (1989) |
| 18 | Barbara B. Simons, Manfred K. Warmuth: A Fast Algorithm for Multiprocessor Scheduling of Unit-Length Jobs. SIAM J. Comput. 18(4): 690-710 (1989) |
| | **1988** |
| 17 | David Haussler, Nick Littlestone, Manfred K. Warmuth: Predicting {0, 1}-Functions on Randomly Drawn Points. COLT 1988: 280-296 |
| 16 | David Haussler, Michael J. Kearns, Nick Littlestone, Manfred K. Warmuth: Equivalence of Models for Polynomial Learnability. COLT 1988: 42-55 |
| 15 | David Haussler, Nick Littlestone, Manfred K. Warmuth: Predicting {0,1}-Functions on Randomly Drawn Points (Extended Abstract). FOCS 1988: 100-109 |
| 14 | Hagit Attiya, Marc Snir, Manfred K. Warmuth: Computing on an anonymous ring. J. ACM 35(4): 845-875 (1988) |
| | **1987** |
| 13 | Anselm Blumer, Andrzej Ehrenfeucht, David Haussler, Manfred K. Warmuth: Occam's Razor. Inf. Process. Lett. 24(6): 377-380 (1987) |
| | **1986** |
| 12 | Daniel Ratner, Manfred K. Warmuth: Finding a Shortest Solution for the N × N Extension of the 15-PUZZLE Is Intractable. AAAI 1986: 168-172 |
| 11 | Elias Dahlhaus, Manfred K. Warmuth: Membership for Growing Context Sensitive Grammars is Polynomial. CAAP 1986: 85-99 |
| 10 | Shlomo Moran, Manfred K. Warmuth: Gap Theorems for Distributed Computation. PODC 1986: 131-140 |
| 9 | Anselm Blumer, Andrzej Ehrenfeucht, David Haussler, Manfred K. Warmuth: Classifying Learnable Geometric Concepts with the Vapnik-Chervonenkis Dimension (Extended Abstract). STOC 1986: 273-282 |
| 8 | Elias Dahlhaus, Manfred K. Warmuth: Membership for Growing Context-Sensitive Grammars is Polynomial. J. Comput. Syst. Sci. 33(3): 456-472 (1986) |
| 7 | Danny Dolev, Eli Upfal, Manfred K. Warmuth: The Parallel Complexity of Scheduling with Precedence Constraints. J. Parallel Distrib. Comput. 3(4): 553-576 (1986) |
| 6 | Jakob Gonczarowski, Manfred K. Warmuth: Manipulating Derivation Forests by Scheduling Techniques. Theor. Comput. Sci. 45(1): 87-119 (1986) |
| | **1985** |
| 5 | Chagit Attiya, Marc Snir, Manfred K. Warmuth: Computing on an Anonymous Ring. PODC 1985: 196-203 |
| 4 | Danny Dolev, Manfred K. Warmuth: Scheduling Flat Graphs. SIAM J. Comput. 14(3): 638-657 (1985) |
| 3 | Jakob Gonczarowski, Manfred K. Warmuth: Applications of Scheduling Theory to Formal Language Theory. Theor. Comput. Sci. 37: 217-243 (1985) |
| | **1984** |
| 2 | Danny Dolev, Manfred K. Warmuth: Scheduling Precedence Graphs of Bounded Height. J. Algorithms 5(1): 48-59 (1984) |
| 1 | Manfred K. Warmuth, David Haussler: On the Complexity of Iterated Shuffle. J. Comput. Syst. Sci. 28(3): 345-358 (1984) |