Internships

265 – Research Assistant, Multimodal Representation Learning of Human Behaviors/Machine Learning

Project Name
Multimodal Representation Learning of Human Behaviors/Machine Learning

Project Description
Machine learning, in general, relies heavily on good representations, or features, of data that yield better discriminatory capability in classification and regression experiments. To derive efficient representations of data, researchers have adopted two main strategies: (1) manually crafted feature extractors designed for a particular task, and (2) algorithms that derive representations automatically from the data itself. The latter approach is called Representation Learning (RL) and has received growing attention because of the increasing availability of both data and computational resources. In fact, RL has been responsible for large performance boosts in a number of machine learning applications, including speech recognition and facial expression analysis. At ICT, we are particularly interested in advancing the state of the art in deep neural networks and machine learning approaches that allow us to learn multimodal representations of human behavior. We will use these representations to assess an individual’s well-being and affective state.
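For illustration only, the sketch below shows one common form of representation learning: a small autoencoder that learns a compact feature vector from unlabeled data. It assumes TensorFlow's Keras API, and the input size, layer widths, and placeholder data are illustrative rather than taken from the project.

# Minimal representation-learning sketch: an autoencoder that compresses
# feature vectors into a low-dimensional learned representation.
# Dimensions and training data below are placeholders, not project values.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

input_dim = 128   # assumed size of a per-frame multimodal feature vector
code_dim = 16     # size of the learned representation

inputs = keras.Input(shape=(input_dim,))
encoded = layers.Dense(64, activation="relu")(inputs)
code = layers.Dense(code_dim, activation="relu", name="representation")(encoded)
decoded = layers.Dense(64, activation="relu")(code)
outputs = layers.Dense(input_dim, activation="linear")(decoded)

autoencoder = keras.Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")

# Train the network to reconstruct its own input, so the representation
# is derived automatically from the data rather than hand-crafted.
x_train = np.random.rand(1000, input_dim).astype("float32")  # placeholder data
autoencoder.fit(x_train, x_train, epochs=5, batch_size=32, verbose=0)

# The encoder alone maps raw features to the learned representation.
encoder = keras.Model(inputs, code)
learned_features = encoder.predict(x_train[:5])
print(learned_features.shape)  # (5, 16)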

Job Description
The candidate should have experience with machine learning approaches and be comfortable programming in Python. The candidate will participate in machine learning experiments that aim to better predict a person’s psychological well-being, e.g., depression recognition. We have access to the largest dataset of depression screening interviews and will leverage big data resources to train successful models. Experience with deep learning toolboxes such as TensorFlow, Theano, or Keras is a big plus.
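As a rough, hypothetical example of the kind of experiment involved (not the project's actual pipeline or data), the sketch below trains a simple binary classifier that predicts a depression-screening label from precomputed behavioral features, again assuming TensorFlow's Keras API and using placeholder data.

# Hypothetical sketch of a screening-label classifier; all sizes, labels,
# and data are placeholders standing in for real interview features.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_samples, n_features = 500, 64                 # illustrative dataset size
x = np.random.rand(n_samples, n_features).astype("float32")
y = np.random.randint(0, 2, size=n_samples).astype("float32")  # placeholder labels

model = keras.Sequential([
    layers.Input(shape=(n_features,)),
    layers.Dense(32, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),      # probability of a positive screen
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Hold out part of the data to check how well the model generalizes.
model.fit(x, y, validation_split=0.2, epochs=10, batch_size=32, verbose=0)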

Preferred Skills

  • Python
  • Machine Learning
  • Linux
  • Deep Learning Toolboxes

Apply now.
