Logic

Using o-minimality to compute lower bounds on sample complexity of neural networks

By Alex Usvyatsov (Universidade de Lisboa, CMAF-CIO).

Abstract: I will discuss the concept of sample complexity in statistical learning theory. Then I will show how the definability of many hypothesis classes (for example, essentially all artificial neural networks used in practice) in o-minimal structures helps to compute tighter lower bounds on sample complexity for these hypothesis classes.

Continuous model theory and Banach space geometry: an introduction (part 2)

By Alexander Usvyatsov (Universidade de Lisboa, CMAF-CIO).

Abstract: These two talks are intended as a soft and not too technical introduction to the model theory of metric structures, including a short history and motivating questions, with particular emphasis on the fundamentals of continuous first-order logic. I will also mention a few successful applications to Banach space theory, as well as more recent promising directions.
