ESPCI internships


We currently have three internship openings for ESPCI students at the University of Geneva:

Contact: Daniel.Huber(a)

Marker-less 3D forelimb tracking for mice

Reaching out for objects and manipulating them are basic gestures governing most of what we do. Understanding the underlying neural circuits is not only a fundamental question in neuroscience, but also has important implications for medical conditions where motor function is affected, such as after a stroke or in Parkinson’s Disease.

Mice share many aspects of reach kinematics with humans and are therefore considered an excellent animal model for studying motor function. Mice can easily be trained to reach for small rewards while neuronal activity is simultaneously recorded in different brain areas. However, tracking the fast-moving, small and furry forelimbs is an extremely challenging task. Reflective markers, such as the ones used on humans, are difficult to apply to mice, since they would perturb the natural movement or be rapidly removed by grooming. For marker-less tracking, manual frame-by-frame video analysis is most commonly used to determine the limb position and to classify the reaching phases. This approach is extremely time-consuming and inefficient.

The goal of this project is to develop an automated system to track and quantify the forelimb movements of mice in high-speed videos. It will be constrained by a realistic 3D musculoskeletal model of the forelimb based on X-ray imaging, which will serve as the ground truth for simultaneously recorded infra-red high-speed videos (60 to 120 fps). The application should be robust and flexible enough to predict the position of the forepaw even under low-contrast illumination or during partial occlusions. This project will be carried out at the University of Geneva, in close interaction with other lab members.


Specific Goals:

  1. Create a realistic 3D multi-joint model of the mouse forelimb (shoulder, elbow and paw) based on X-ray imaging.
  2. Develop an application for automatic detection and tracking of the mouse forelimb during goal-directed reaches based on high-speed infra-red imaging.
  3. Adapt the system for online tracking and closed loop applications.
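As a minimal sketch of the Kalman-filter component listed under Techniques: a constant-velocity filter, one per image axis, can smooth noisy paw detections and coast through short occlusions. Everything below (pixel-per-frame units, the noise variances, the `AxisKalman` name) is illustrative only, not the lab's actual implementation:

```python
class AxisKalman:
    """Constant-velocity Kalman filter for one image coordinate.

    Units are pixels and frames, so "velocity" is pixels per frame.
    Two of these, one per axis, give a minimal 2D paw tracker; the
    real project would add the 3D musculoskeletal constraint on top.
    """

    def __init__(self, q=1e-2, r=4.0):
        self.q = q                        # process-noise variance (illustrative)
        self.r = r                        # measurement-noise variance, px^2
        self.x, self.v = 0.0, 0.0         # estimated position and velocity
        # 2x2 covariance, stored as its three distinct entries
        self.p00, self.p01, self.p11 = 100.0, 0.0, 100.0

    def predict(self):
        # x' = x + v (one frame); P' = F P F^T + Q with F = [[1,1],[0,1]]
        self.x += self.v
        self.p00 += 2.0 * self.p01 + self.p11 + self.q
        self.p01 += self.p11
        self.p11 += self.q
        return self.x

    def update(self, z):
        # z is the detected pixel coordinate, or None when the paw is
        # occluded; on occluded frames the filter coasts on its model.
        if z is None:
            return self.x
        s = self.p00 + self.r                  # innovation variance
        k0, k1 = self.p00 / s, self.p01 / s    # Kalman gains
        y = z - self.x                         # innovation (residual)
        self.x += k0 * y
        self.v += k1 * y
        # P' = (I - K H) P with H = [1, 0]
        self.p00, self.p01, self.p11 = (
            (1.0 - k0) * self.p00,
            (1.0 - k0) * self.p01,
            self.p11 - k1 * self.p01,
        )
        return self.x
```

Because the update step can simply be skipped, the same filter handles the partial occlusions mentioned above: the estimate keeps moving along the last inferred velocity until the paw is detected again.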

Requirements: Strong programming skills, keen interest in neuroscience and animal behavior, team player.

Techniques: High-speed videography, X-ray movement analysis, machine learning, computer vision, Kalman filter.

Duration: 3-4 months

Contact: Daniel.Huber(a)

Closed-loop force-field joystick for mice

One of the surprising features of mammalian motor control is the ability to plan and control goal-directed movements independently of the initial position or of external forces. This ability allows us, for example, to drink from a cup of coffee in any position and independently of whether it is full or almost empty. To uncover the related neuronal mechanisms, motorized manipulanda such as joysticks can be used. They can create the illusion of springs, weights or obstacles by generating arbitrary forces in any direction, while simultaneously measuring the kinematics and forces applied by the subject. Such devices are also important tools for the rehabilitation of patients after injury of the motor system.

In the laboratory, we use mice to study the neuronal circuits involved in motor control. Mice are very dexterous with their forelimbs and share many kinematic features with humans. We routinely record and manipulate brain activity while mice learn to move miniature joysticks. The direction of the movement is indicated with different sensory stimuli and mice learn this association rapidly. Currently our joysticks are restricted to two directions and are simply spring-loaded, which limits the types of experiments.

In the proposed project, the candidate will develop a miniature robotic joystick for applying force fields to guide or perturb mouse forepaw movements.

  1. Design a mouse-sized robotic joystick comprising:
    1. An online measurement system for movement kinematics (e.g. optical encoders, load cells, force-sensitive resistors)
    2. An actuating system for applying force perturbations (e.g. linear motors, electromagnetic motors)
  2. Program a real-time processing routine and a GUI allowing users to specify the force-field characteristics. Programming of data-acquisition hardware (NI or Arduino), preferably in Matlab or Python, will be required.
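The real-time routine can be sketched as a read-compute-actuate loop. Below is a minimal, hardware-agnostic Python illustration; `read_state` and `apply_force` are hypothetical placeholders for the DAQ and motor drivers, and the specific fields (a velocity-dependent "curl" field and a virtual spring) and gain values are assumptions chosen only for the example:

```python
def curl_field(vx, vy, b=0.15):
    # Force orthogonal to the instantaneous velocity (magnitude b*|v|):
    # it deflects the movement without opposing it, a classic
    # perturbation used in motor-learning experiments.
    return b * vy, -b * vx

def spring_field(x, y, k=0.5):
    # Restoring force toward the joystick's rest position, i.e. the
    # illusion of a spring load.
    return -k * x, -k * y

def control_loop(read_state, apply_force, field, n_steps):
    """Simplified real-time loop: read kinematics, compute the field
    force, command the actuators. read_state() returns (x, y, vx, vy);
    apply_force(fx, fy) drives the motors. Both are hypothetical
    stand-ins for the actual DAQ/motor interface."""
    for _ in range(n_steps):
        x, y, vx, vy = read_state()
        fx, fy = field(vx, vy)
        apply_force(fx, fy)
```

In the real system the loop would run at a fixed rate on the acquisition hardware; injecting the drivers as callables keeps the field logic testable without hardware attached.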

Requirements: Excellent engineering skills (mechanics and electronics), strong programming background, highly motivated, keen interest in biology and neuroscience.

Techniques: Closed loop control, load cells, linear motors, robotics.

Duration: 3-4 months

Contact: Daniel.Huber(a)

Video games for small primates

In systems neuroscience research we try to relate neuronal activity to behavior. This is usually achieved by designing behavioral tasks in which the subject (human or animal) solves a given problem under controlled laboratory conditions, outside its natural environment. Many tasks in neuroscience are based on visual cues such as sequences of images, fixed patterns or brief video scenes. But instead of bringing the subject into the laboratory, why not bring the lab to the subject? With smartphones, this is already a reality for humans: we use every free minute to solve visually guided tasks (video games, maps, instructions) that were previously solved at a desk or in a laboratory setting.


This project aims to develop a robust behavioral system that can be mounted in the natural habitat of small primates, allowing them to access it at any moment of the day to perform a visually guided behavioral task in which they collect food rewards and during which their brain activity can be monitored. In addition, we plan to measure body parameters, such as eye position, heartbeat, and the position of the freely moving subjects. This project will be carried out in Geneva and Paris in close collaboration with lab members.

  1. Consolidate an existing behavioral control system for use in the wild (miniaturization, waterproofing, energy supply).
  2. Implement small single-board computers (Arduino, Raspberry Pi) or open-source smartphones for the presentation of complex visual stimuli.
  3. Integrate additional miniaturized, portable hardware (head-mounted eye-tracking cameras, accelerometers).
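The behavioral control logic the system must run can be sketched as a trial loop. The version below is a minimal, hardware-agnostic Python illustration; the two-alternative trial structure and the three driver callables (`show_stimulus`, `get_response`, `give_reward`) are purely illustrative assumptions, not a description of the lab's existing system:

```python
def run_trial(target, show_stimulus, get_response, give_reward, timeout_s=5.0):
    """One trial of a hypothetical two-alternative visual task.

    show_stimulus, get_response and give_reward stand in for the
    display, response-sensor and feeder drivers of the real device.
    get_response returns the subject's choice, or None if no response
    arrived within timeout_s seconds.
    """
    show_stimulus(target)             # present the left/right visual cue
    choice = get_response(timeout_s)  # block until choice or timeout
    correct = (choice == target)
    if correct:
        give_reward()                 # food reward on correct trials only
    return target, choice, correct
```

Keeping the trial logic independent of the display and reward hardware makes it easy to move the same task between an Arduino-driven prototype and a smartphone-based version.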

Requirements: Excellent engineering skills (mechanics and electronics), strong programming background, highly motivated, keen interest in biology and neuroscience.

Techniques: Hardware design, behavioral analysis.

Duration: 3-4 months

Contact: Daniel.Huber(a)