Learning with non-stationary data: application to collaborative filtering and link prediction between named entities in knowledge bases such as Freebase

Subject: The continuous production of tremendous amounts of data upsets the traditional view in science and information technology, particularly in machine learning (ML). These data generally evolve over time and do not follow the fundamental hypothesis of stationarity upon which learning theory is based. This is, for example, the case in collaborative filtering, where the goal is to generate personalized recommendations for each user. Recommender systems filter a potentially huge set of items and extract a subset of N items that best matches a user's needs, with respect to the observed preferences of other users who may have the same tastes. In this case, user preferences generally evolve over time, as the perception of different items, as well as their popularity, is completely time dependent.
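One simple way to account for such time dependence is to weight observed ratings by their recency when scoring candidate items. The sketch below (hypothetical data and parameter names; an exponential decay with a tunable half-life stands in for whatever non-stationarity model the thesis would develop) scores a user's unseen items from other users' time-decayed ratings and returns the top-N:

```python
import math
from collections import defaultdict

# Hypothetical ratings: (user, item, rating, timestamp in days).
ratings = [
    ("alice", "A", 5.0, 10), ("alice", "B", 3.0, 40),
    ("bob",   "A", 4.0, 12), ("bob",   "C", 5.0, 45),
    ("carol", "B", 4.0, 20), ("carol", "C", 2.0, 30),
]

def decayed_scores(ratings, user, now, half_life=30.0, top_n=2):
    """Score items `user` has not rated by summing other users' ratings,
    each weighted by an exponential time decay (recent ratings count more)."""
    seen = {i for u, i, _, _ in ratings if u == user}
    scores = defaultdict(float)
    for u, item, r, t in ratings:
        if u == user or item in seen:
            continue
        weight = 0.5 ** ((now - t) / half_life)  # halves every `half_life` days
        scores[item] += weight * r
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(decayed_scores(ratings, "alice", now=50))  # → ['C']
```

A stationary model would weight all ratings equally; the half-life controls how quickly old preferences are forgotten.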
Showing posts from February, 2014
INRIA - Lille - France
Efficient Sequential Learning in Structured and Constrained Environments, SequeL team, INRIA, Lille, France (PhD position)
Keywords: multi-armed bandit, stochastic optimization, reinforcement learning, apprenticeship learning, learning on graphs, transfer learning
This Ph.D. program is focused on the problem of sequential learning in structured and constrained environments. The key aspect of this problem is that relatively little knowledge of the environment is available beforehand, and the learner (a virtual or physical agent) has to interact sequentially with the environment to learn its structure and then act optimally. This problem encompasses a wide range of applications, depending on the definition of the structure of the environment, the sequential nature of the interaction between learner and environment, and the type of constraints. In particular, this Ph.D. program will be driven by two application domains of great scientific and commercial interest.
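The multi-armed bandit setting mentioned in the keywords is the simplest instance of this learn-then-act-optimally loop. A minimal sketch of the classic UCB1 strategy (the arm means and horizon below are illustrative, not from the announcement): each arm is pulled once, then the learner picks the arm maximizing its empirical mean plus an exploration bonus that shrinks as the arm is sampled more:

```python
import math
import random

def ucb1(pull, n_arms, horizon):
    """UCB1: after one pull per arm, choose the arm maximizing
    empirical mean + sqrt(2 ln t / n_pulls), then update statistics."""
    counts = [0] * n_arms
    sums = [0.0] * n_arms
    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1  # initialization: pull each arm once
        else:
            arm = max(range(n_arms),
                      key=lambda a: sums[a] / counts[a]
                      + math.sqrt(2 * math.log(t) / counts[a]))
        reward = pull(arm)
        counts[arm] += 1
        sums[arm] += reward
    return counts

random.seed(0)
means = [0.2, 0.5, 0.8]  # hypothetical Bernoulli arms
counts = ucb1(lambda a: float(random.random() < means[a]), 3, 2000)
print(counts)  # the best arm (index 2) accumulates most of the pulls
```

The exploration bonus is what lets the learner act with little prior knowledge of the environment while still converging on the best action.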
In recent years, there have been several breakthroughs in the use of neural networks for natural language processing, in particular using deep architectures. The computer science laboratory of the University of Le Mans (LIUM) has been working for many years on statistical machine translation (SMT), and we were among the first researchers to successfully use neural networks for it, for instance continuous space language and translation models.
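The core idea of a continuous space language model is to replace discrete word counts with learned embeddings fed through a small network. A toy forward pass in that style (random weights and toy dimensions, purely illustrative of the architecture, not LIUM's models): embed the context word, apply a tanh hidden layer, and softmax over the vocabulary:

```python
import numpy as np

rng = np.random.default_rng(0)
V, d, h = 10, 4, 8  # toy vocabulary size, embedding dim, hidden dim

# Random parameters of a feed-forward LM over a one-word context.
E = rng.normal(size=(V, d))   # continuous-space word embeddings
W1 = rng.normal(size=(d, h)); b1 = np.zeros(h)
W2 = rng.normal(size=(h, V)); b2 = np.zeros(V)

def next_word_probs(context_word_id):
    """Embedding -> tanh hidden layer -> softmax distribution
    over the next word."""
    x = E[context_word_id]
    z = np.tanh(x @ W1 + b1)
    logits = z @ W2 + b2
    p = np.exp(logits - logits.max())  # stable softmax
    return p / p.sum()

p = next_word_probs(3)
print(p.shape)  # → (10,): one probability per vocabulary word
```

Because similar words get nearby embeddings, such a model generalizes to n-grams never seen in training, which is what made these models useful for SMT.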