**2019**

#### NP-Hardness of minimum expected coverage

PATTERN RECOGNITION LETTERS, v. 117, p. 45-51, 2019

**Researchers:**
Lucas Henrique Sousa Mello
Flávio M. Varejão
Alexandre L. Rodrigues
Thomas W. Rauber

In multi-label learning, a single object can be associated with multiple labels simultaneously. In a context where labels follow a random distribution, every labelling has a probability of occurrence. Thus, any prediction is associated with an expected error, measured by a predefined loss function. From an exponential number of possible labellings, an algorithm should choose the prediction that minimizes the expected error. This is known as loss minimization. This work proves that a specific case of loss minimization for the Coverage loss function is NP-complete with respect to the number of labels, which allows one to conclude that the general case is NP-hard.
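To make the loss-minimization problem concrete, the sketch below (not the paper's construction) brute-forces the ranking that minimizes the expected Coverage loss over a toy distribution of labellings. The function names and the toy distribution are illustrative assumptions; note also that some texts define Coverage as the deepest rank minus one, a convention difference that does not affect which ranking is optimal.

```python
# Illustrative sketch only: exhaustive minimization of expected Coverage
# loss over a toy label distribution. The exponential search over all
# rankings mirrors why the general problem is computationally hard.
from itertools import permutations

def coverage(ranking, labelling):
    """Depth (1-based) in the ranking needed to include every relevant
    label; 0 if the labelling has no relevant labels. Some definitions
    subtract 1, which does not change the optimal ranking."""
    relevant = [i for i, y in enumerate(labelling) if y == 1]
    if not relevant:
        return 0
    return max(ranking.index(i) for i in relevant) + 1

def expected_coverage(ranking, dist):
    """Expected Coverage loss of a ranking under a distribution over
    labellings, given as a dict {labelling tuple: probability}."""
    return sum(p * coverage(ranking, y) for y, p in dist.items())

def min_expected_coverage(n_labels, dist):
    """Search all n! rankings for the minimizer of the expected
    Coverage loss -- exponential in the number of labels."""
    return min(permutations(range(n_labels)),
               key=lambda r: expected_coverage(r, dist))

# Toy distribution over labellings of 3 labels (probabilities sum to 1).
dist = {(1, 0, 0): 0.5, (1, 1, 0): 0.3, (0, 0, 1): 0.2}
best = min_expected_coverage(3, dist)
```

Here the search visits all 3! = 6 rankings; with n labels both the number of rankings and the number of possible labellings grow exponentially, which is why a polynomial-time algorithm for the general case would be surprising given the NP-hardness result.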