
BRAIN CONNECTIVITY LEAP

THE BRAIN VIRTUAL REALITY NAVIGATOR (LEAP MOTION + OCULUS RIFT + UNITY3D)
Have you ever wondered what it would be like to navigate inside your own brain?
 
We have, so we did it!
 
Try it out yourself with the:
Brain Connectivity Leap

This is the Brain Connectivity Leap, an app developed by Filipe Rodrigues, Ricardo Ribeiro, and Hugo Alexandre Ferreira within the scope of the Fundação para a Ciência e Tecnologia grant PTDC/SAU-ENB/120718/2010.

 

This app was developed in Unity3D and makes use of magnetic resonance imaging (MRI) data, the Oculus Rift to provide an immersive environment, and the Leap Motion controller to enable navigation and interaction with the virtual brain. The MRI data was processed using the Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox, which pipelines different neuroimaging software packages: here, Freesurfer was used to parcellate 3D T1-weighted MRI data, and Diffusion Toolkit was used to process Diffusion Tensor Imaging (DTI) data and estimate tracts. MIBCA then computed the number of tracts between every pair of brain regions, resulting in a structural connectivity matrix. This matrix is represented here as a brain or anatomical graph, in which the thickness of each edge is proportional to the number of tracts connecting the two brain regions it joins. Anatomical, structural, functional, and effective connectivity data derived from T1-weighted, DTI, functional MRI, and positron emission tomography (PET) data can all be used. The application is evolving to include different brain parcellations/atlases, tractography and functional maps, and easier data importation. We are also developing a device to further boost interactivity!
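As a minimal sketch of the idea behind the graph rendering (the function, matrix, and scaling are illustrative assumptions, not MIBCA or app code), edge thickness proportional to tract counts could be computed like this:

```python
import numpy as np

def edge_thickness(tract_counts, max_thickness=5.0):
    """Scale edge thickness in the brain graph proportionally to the
    number of tracts connecting each pair of regions."""
    counts = np.asarray(tract_counts, dtype=float)
    peak = counts.max()
    if peak == 0:
        return np.zeros_like(counts)
    return max_thickness * counts / peak

# Toy 3-region structural connectivity matrix (symmetric tract counts).
connectivity = np.array([[0, 120, 30],
                         [120, 0, 60],
                         [30, 60, 0]])

thickness = edge_thickness(connectivity)
```

The strongest connection gets the maximum thickness and all others scale linearly below it.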

CARDIMAX FEELS YOUR MOOD!
 
LINK TO AN AMAZING CODE ...


Here is Cardimax, a fun device that feels your mood and translates it into light and music!

 

Developed by Ana Rita Lopes, Inês Palma e Cruz, and Joana Dias for the Medical Robotics class 2014-2015. It makes use of the Bitalino board with a pulse sensor placed at the ear lobe. The oximetry signals are acquired and processed by a Python application, which then modulates the voltage of two RGB LEDs placed inside the "CardiMax" via an Arduino Pro Mini and plays a tune according to the user's heart rate. Cold colours and slower-paced tunes correspond to slower heartbeats, whilst warm colours and faster-paced tunes correspond to faster heartbeats.
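A hypothetical Python sketch of the heart-rate-to-light-and-music mapping (the function name and bpm ranges are assumptions, not the students' actual code):

```python
def mood_mapping(heart_rate_bpm, hr_min=50, hr_max=120):
    """Map heart rate to an RGB colour (cold -> warm) and a tune tempo.

    Cold (blue) colours and slow tempos for slow heartbeats,
    warm (red) colours and fast tempos for fast heartbeats.
    """
    # Clamp and normalise the heart rate to [0, 1].
    t = (min(max(heart_rate_bpm, hr_min), hr_max) - hr_min) / (hr_max - hr_min)
    red, blue = int(255 * t), int(255 * (1 - t))
    tempo_bpm = 60 + 80 * t  # tune tempo tracks the heartbeat
    return (red, 0, blue), tempo_bpm
```

At rest the LEDs sit at the blue end with a slow tune; as the pulse climbs, the colour slides toward red and the tempo speeds up.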

 

Really fun!

  

BRAIN TUNES
 
DO THEY HELP YOU STUDY?

Here is the demo app that Carlos Duarte, João Antunes, and Tiago Fernandes developed for the Neurosciences class 2013-2014.

 

This demo app made use of the Neurosky headset to measure the user's brainwaves (prefrontal cortex). The app was developed for Android, and communication between the smartphone and the Neurosky headset was done via Bluetooth. The goal of the app is to help the user remain concentrated while studying by playing a tune of his/her liking. Here we see the user engaged in reading: when his concentration levels reach a pre-defined threshold, the smartphone screen changes colour and starts playing music!
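The threshold logic could be sketched as follows (a hypothetical example with a small hysteresis band so playback does not flicker around the threshold; names and values are illustrative, not the actual app code):

```python
def music_state(attention, playing, start=60, stop=50):
    """Decide whether the reward music should play, given the 0-100
    attention value reported by the headset.

    Hysteresis: start above `start`, stop only below `stop`, so small
    fluctuations around the threshold do not toggle playback."""
    if attention >= start:
        return True
    if attention < stop:
        return False
    return playing  # in the dead band, keep the current state
```

Calling this once per headset reading yields a stable on/off signal for the music and the screen colour.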

 

Would it work for you?

  

PHYSIOLOGICAL COMPUTING GAMES
 
CHALLENGE YOUR BIOSIGNALS TO A MATCH!

At the time, the Euro 2012 Cup was ongoing, and Polish students Adam Chec and Domika Olczak challenged Portuguese student Tiago Fernandes (yes, the same guy as above) to a match. But not a conventional match!

 

Tiago was wearing a home-made belt with electrolycra-based sensors for reading the electrical activity of his heart (so-called electrocardiography - ECG). These biosignals were acquired with the Bitalino and fed to a video game developed in Unity3D: the ECG Street Soccer game. Before reaching Unity, though, the signals were processed by a Python application that filtered them and computed the heart rate. The gameplay was then as follows: if Tiago's heart rate was high, his player was faster, but at the same time his stamina got progressively lower, and the player started getting tired and moved slowly in spite of a fast heart rate. So Tiago had to compromise on how long he could keep a fast-beating heart! Challenging, huh?
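The speed/stamina trade-off could be sketched like this (a toy Python model with made-up constants, not the actual game code):

```python
def step(heart_rate, stamina, dt=1.0, hr_rest=70):
    """One gameplay tick: a fast heart rate boosts player speed but
    drains stamina; speed collapses once stamina runs out."""
    excitement = max(heart_rate - hr_rest, 0) / 50.0        # 0 at rest, ~1 when racing
    stamina = max(stamina - excitement * 5.0 * dt, 0.0)     # drain scales with excitement
    stamina = min(stamina + 2.0 * dt * (1 - excitement), 100.0)  # recover when calm
    speed = (1.0 + excitement) * (stamina / 100.0)          # tired players slow down
    return speed, stamina
```

Holding a high heart rate gives a short burst of speed, but the stamina term guarantees it cannot be sustained forever, which is exactly the compromise Tiago had to manage.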

 

If I recall correctly, Tiago kept losing against the computer ;)

 

This work was done in the Medical Robotics class, and subsequently Adam did his research internship with me and wrote his Bachelor's thesis on the topic. He further developed a simpler application with the goal of moving towards Android. We ended up developing the FlappyHeartPC game (here PC stands for Physiological Computing). Check it at:

 

FLAPPYHEARTPC

 

We also have a paper and a poster on this work!

  

ROBOTIC GLOVE
 
A FIRST STEP TOWARDS ACCESSIBLE NEUROREHABILITATION

Medical Robotics class 2012-2013 students Ana Rita Soares, Carolina Carvalho, Carolina Vale, Francisco Malheiro, Inês Santos, and João Agostinho jointly developed a quite simple, low-cost, and surprisingly effective robotic glove.

 

The glove started out as a regular sports glove, to which lollipop sticks were glued (you read that right, lollipop sticks!) on both the dorsal and palmar sides. Then, using fishing (nylon) line threaded through the sticks (they are hollow), the tips of the five fingers were connected to two bidirectional stepper motors: the dorsal lines were connected to one of the motors and the palmar lines to the other. In this manner it was possible to control hand opening (extension) and closing (flexion). Additionally, two button sensors were placed on the glove: one on the dorsal side of the proximal phalanx of the second finger (the index), and the other on the palmar side of the distal phalanx of the first finger (the thumb). These sensors signalled the motors when to stop upon complete hand extension or flexion, respectively. Everything was controlled using an Arduino Uno, and it worked quite well, as the opening and closing felt quite "natural" (i.e. no awkward finger movements due to the lines).
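The stop logic of the two limit sensors can be sketched as one control-loop decision (a hypothetical Python rendering with made-up names; the actual implementation ran on the Arduino Uno):

```python
def glove_step(command, extension_button, flexion_button):
    """Decide the stepper-motor action for one control-loop iteration.

    command: 'open' or 'close'; the two booleans are the limit sensors
    that signal complete hand extension / flexion, respectively."""
    if command == "open":
        return "stop" if extension_button else "run_dorsal_motor"
    if command == "close":
        return "stop" if flexion_button else "run_palmar_motor"
    return "stop"  # no command: hold position
```

Each motor keeps pulling its set of lines until the matching button sensor fires, which is what keeps the movement from over-extending or over-flexing the fingers.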

 

Presently the robotic glove is being used in a larger (neuro)-rehabilitation platform.

  

MIND CONTROLLED LEGO
 
LEGO, THE MOVIE, CAN FINALLY HAPPEN IN REAL LIFE!

During his 3rd-year bachelor's degree internship, Diogo Duarte developed a brain-computer interface (BCI) application using the Neurosky headband. Here, brainwave patterns were decoded to control the movement of a LEGO Mindstorm NXT robot. The "mind-control" of the robot was tested by "ordering" it to move around obstacles, which it did successfully!
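A hedged sketch of how headset readings might be mapped to robot commands (the thresholds, signal names, and command set are illustrative assumptions, not Diogo's actual decoding):

```python
def decode_command(attention, blink_detected):
    """Translate headset readings into a simple robot command:
    sustained attention drives forward, a blink turns, low attention stops."""
    if blink_detected:
        return "turn"
    if attention >= 70:  # 0-100 attention value from the headset
        return "forward"
    return "stop"
```

Even a coarse mapping like this is enough to steer a robot around obstacles, which is what makes low-cost headsets attractive for demonstrations.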

 

This was a simple and low-cost way of demonstrating brainwave control of motorized devices, opening the perspective that such systems may one day become widespread among patients with motor impairments.

 

Both a communication and a conference proceedings paper resulted from this work. Pretty great work for an undergrad!

  

PROJECT ZHAYEDAN

SHORT FILM ABOUT A DYSTOPIAN FUTURE, IN WHICH TISSUE AND ORGAN ENGINEERING TECHNOLOGY GOES ASTRAY
(IN PORTUGUESE)


TISSUE AND ORGAN ENGINEERING IS THE FUTURE. BUT LIKE ALL TECHNOLOGY, IT MAY BE SUBVERTED.
 
What could go wrong?
 
Find out at 
PROJECT ZHAYEDAN

Short film (30 min) made by my students of Tissue Engineering and Artificial Organs of the integrated master course on Biomedical Engineering and Biophysics of the Faculty of Sciences of the University of Lisbon (class of 2012-2014).

 

Credits

 

Direction

Tiago Ferro

 

Screenplay

André Girão

João Martins

Mariana Trincão

Patrícia Zoio

Sónia Ferreira

Tânia Roque

 

Scientific Consultants

João Periquito

Tiago Ferro

Tiago Silva

 

Image Edition

Carla Semedo

Catarina Freitas

Neuza Silva

Rui Teixeira

Tiago Silva

Yiyi Ji

 

Sound Edition

Ana Rita Rocha

Andreia Cândido e Silva

Gil Braz

Marina Costa

Melissa Sirage

 

Actors

André Baião - Scientist

Sara Reis - Company's President

Patrícia Zoio - Company's President Daughter

Rui Teixeira - Trainee

João Fernandes - Repo Man

Guilherme Oliveira - TECER

Ana Lourenço - Humanoid

Gil Braz - Security Officer

Gonçalo Condeço - Jarvis (Voice)

Andreia Cândido e Silva - Sister 1

Melissa Sirage - Sister 2

Tiago Ferro - Physician

Sónia Ferreira - Physician

Andreia Capaz Silva - Marketeer 1

Catarina Freitas - Marketeer 2

 

Nanotechnologies in Biomedicine students Joana Brito, Nivaldo Pereira, and Vânia Tavares have adapted and translated into Portuguese the inspirational work entitled "Jell-O Chips" by authors Yang, Ouellet, and Lagally. This work aims to provide students from earlier ages and the general population alike with concepts related to Nanotechnology, and Microfluidics in particular, by exploring a fun, learn-by-doing approach using everyday products.

 

They have also prepared a presentation and a guide, carefully explaining the preparation, all the experiments, and the related concepts.

 

In the video above, the setup of the Y-channel and the laminar flow inside the channel are shown.
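The laminar flow seen in the Y-channel is a consequence of the very low Reynolds number in channels this small; a quick back-of-the-envelope calculation (assuming water-like density and viscosity, with illustrative flow values) makes the point:

```python
def reynolds_number(velocity_m_s, channel_width_m,
                    density=1000.0, viscosity=1e-3):
    """Reynolds number Re = rho * v * L / mu for water-like fluid.

    Re well below ~2000 means laminar flow, which is why the two dye
    streams in the Y-channel flow side by side without turbulent mixing."""
    return density * velocity_m_s * channel_width_m / viscosity

# A 1 mm channel at 1 mm/s: deep in the laminar regime.
re = reynolds_number(velocity_m_s=0.001, channel_width_m=0.001)
```

At Re around 1, mixing happens only by diffusion across the interface between the two streams.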

 

Enjoy!

 

  

  

GELATIN CHIPS
 
A SIMPLE AND PRACTICAL WAY TO LEARN MICROFLUIDICS

Luís Atalaya took the challenge seriously and developed a fantastic physiological computing game called "EXTREME ZEN", a retro-looking platform game involving laser guns, aliens, and meteorites!

 

In this game the player can choose to go right (ZEN) or left (EXTREME). The twist is that the game depends on your heart rate: if your heart rate is fast, the EXTREME side of the game is easier; if it is slow, the ZEN side is easier. The fun part is that you really need to learn how to control your heart rate to achieve the best performance.

 

EXTREME ZEN is powered by Bitalino, an open-source biosignal acquisition platform, which reads electrocardiography (ECG) signals measured at the fingers and palms using electrolycra sensors. The signals are then fed into the game, which was developed in Unity3D.
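The heart-rate-dependent difficulty could be sketched like this (the heart-rate range and the linear mapping are illustrative assumptions, not the actual game code):

```python
def side_difficulty(heart_rate, hr_low=60, hr_high=100):
    """Return (extreme_difficulty, zen_difficulty), each in [0, 1].

    A fast heart rate makes the EXTREME side easier (lower difficulty)
    and the ZEN side harder, and vice versa for a slow heart rate."""
    # Clamp and normalise the heart rate to [0, 1].
    t = (min(max(heart_rate, hr_low), hr_high) - hr_low) / (hr_high - hr_low)
    return 1.0 - t, t
```

Because the two difficulties always sum to one, the player can never make both sides easy at once: controlling the heart rate is a genuine trade-off.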

 

Nowadays, Luís is finishing his master's thesis in Bioinformatics under the joint supervision of Emanuel Santos, a colleague from LASIGE FCUL, and myself. The final goal is to make the game work on a tablet!

 

  

PHYSIOLOGICAL COMPUTING GAMES PART II
 
HEART ARCADE ! !

José Soeiro developed a great Mixed Reality app for brain visualization! This was the topic of his master's thesis in Informatics, co-supervised by BIOISI/MAS (Agents and Systems Modelling) FCUL colleagues Ana Paula Cláudio and Maria Beatriz Carmo and myself.

 

This app was developed in the Metaio environment for use on an Android smartphone. It encompasses both Augmented Reality (AR) and Virtual Reality (VR) functionalities, including the depiction of instruments and of selected brain regions.

 

The correct superimposition of a digital image on the real view was achieved using QR codes placed on the subject's head (here a mannequin), and the complexity of the displayed image depends on the performance of the Android device used.
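A crude sketch of how a detected QR code can anchor an overlay in 2-D (real AR frameworks such as Metaio estimate a full 3-D pose from the markers; this simplified version, with hypothetical names, is illustrative only):

```python
def overlay_transform(qr_corners, qr_size_mm=50.0):
    """Estimate overlay position and scale from detected QR-code corners.

    qr_corners: four (x, y) pixel coordinates of the marker corners.
    Returns the marker centre (where the overlay is anchored) and the
    pixels-per-millimetre scale derived from the known marker size."""
    (x0, y0), (x1, y1) = qr_corners[0], qr_corners[1]  # two adjacent corners
    side_px = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    px_per_mm = side_px / qr_size_mm
    cx = sum(x for x, _ in qr_corners) / len(qr_corners)
    cy = sum(y for _, y in qr_corners) / len(qr_corners)
    return (cx, cy), px_per_mm
```

Knowing the physical size of the printed marker is what lets the app size the virtual brain consistently with the real head.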

 

The goal of this app is to help physicians localize and navigate the brain, whether for biopsy, surgical planning, or the application of Transcranial Magnetic Stimulation (TMS).

 

Usability tests with physicians were quite encouraging!

  

BRAIN AR/VR
 
THE NEXT EVOLUTION ON BRAIN VISUALIZATION