I received my MS in Computer Science from Virginia Tech, Blacksburg, in Summer 2020, where I was co-advised by Dr. Chris North and Dr. Nicholas Polys. During my MS, I worked as a teaching and research assistant. My MS thesis, “Immersive Space to Think: The Role of 3D Immersive Space in Sensemaking of Textual Data”, led to a publication at the 4th Workshop on Immersive Analytics at ACM CHI 2020.

During my graduate studies, I had the opportunity to intern at several research labs, including the UX Lab at Informatica in California and the Visual Analytics Lab at UPS in Atlanta.

Prior to my MS, I worked as a research assistant at the Helsinki Institute for Information Technology (HIIT), Finland. I have been fortunate to learn from and collaborate with many researchers throughout my career, leading to peer-reviewed publications in premier venues such as ACM CHI and UIST. Please see my Google Scholar page for more details.

My research interests lie in human-computer interaction, data visualization, immersive analytics, visual analytics, and machine learning.

Research Experience

I have been fortunate to work on a variety of projects during my research career. Below are brief descriptions of those projects.

Immersive Space to Think (IST): Combining Virtual Reality and Analytics for Improved Sensemaking

IST logo

In this research, we explore immersive virtual reality (VR) for visual analytics. We propose a novel immersive system in which analysts can organize data, generate hypotheses, and form a story. We call this system Immersive Space to Think (IST). We hypothesize that immersive VR can provide a more expressive, expansive space to think during analytic synthesis, and that immersive systems offer new opportunities for semantic interaction to guide machine learning algorithms and improve analytic outcomes. I am using an iterative design approach to develop the system. Additionally, I am collecting and analyzing user behaviour (in making sense of data inside VR) through lab studies, the think-aloud protocol, server logs, and interviews.

[Project] [Paper]
Go to top

Tracking user data on Android devices for recommending user activity

Android logo

I worked with Dr. Sasu Tarkoma (Professor) and Dr. Mohammad Hoque on Big Data and the Internet of Things (IoT) to develop an application (using cross-platform integration of Go with Android) for distributed processing of large data sets, such as image search and retrieval. The work was funded by the Cubic project of the Academy of Finland.

Go to top

MindSee EU project

MindSee logo

I was part of the MindSee project, which aims to develop an information-seeking application that exemplifies the fruitful symbiosis of modern brain-computer interface technology with real-world human-computer interaction. The goal is a cutting-edge information retrieval system that outperforms state-of-the-art tools by more than doubling information-seeking performance on realistic tasks. The project is a joint collaboration between leading universities around the world. I integrated the client-server communication in openFrameworks; upgraded the server to work with any client version; carried out a user study tracking participants' gaze, using an eye tracker integrated into the search system, as they searched for scientific articles; and extracted and analysed the collected data. The work was funded by the MindSee EU project.

Go to top

Cross system recommendation

CrossSystem logo

In this project, I worked with Dr. Giulio Jacucci (Professor) from the University of Helsinki and with Dr. Tuukka Ruotsalo and Dr. Jaakko Peltonen from Aalto University, as part of the Helsinki Institute for Information Technology (HIIT). We also collaborated with the School of Information Sciences, University of Pittsburgh, working with Dr. Peter Brusilovsky (Professor) and his doctoral student Chirayu Wongchokprasitti. I was responsible for carrying out the user study with 24 users on both the source and target systems, and for preparing the necessary files and documentation for the study. Each session lasted 3 to 4 hours, depending on how long each user took to complete the two target tasks; task completion time was not constrained. I was also responsible for implementing client-side data logging in our research prototype using HTML5 localStorage, and I implemented automatic storage of the logged JSON data in appropriate files on the server. I also took part in analysing the collected data. This project was both part of and funded by HIIT's Wide Focus Area research.
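The server-side step of this logging pipeline, appending each JSON record received from the client to a per-user file, can be sketched roughly as follows. This is a minimal illustration in Python, not the original implementation; the function and field names (`store_log_record`, `user_id`) are assumptions.

```python
import json
from pathlib import Path

def store_log_record(record: dict, log_dir: Path) -> Path:
    """Append one JSON-encoded interaction record to a per-user log file."""
    log_dir.mkdir(parents=True, exist_ok=True)
    # One JSON-lines file per study participant, e.g. logs/user_07.jsonl
    path = log_dir / f"user_{record['user_id']:02d}.jsonl"
    with path.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return path
```

Appending one JSON object per line keeps the log easy to replay later during analysis, one event at a time.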

Our full paper was accepted at User Modeling, Adaptation, and Personalization (UMAP) 2015.
[UMAP (2015) paper]
Go to top

Navigating complex information spaces: A portfolio theory approach

Thesis logo

In this project, I worked under the supervision of Dr. Antti Ukkonen and Dr. Tuukka Ruotsalo from Aalto University and Dr. Giulio Jacucci (Professor) from the University of Helsinki. How users find information is a central topic in human-computer interaction and information retrieval. Previous studies have shown that users prefer to find information beyond the query, and with the ever-increasing amount of information on the web, users are often lost in the vast information space. This project concentrates on navigation tasks rather than search tasks. Navigating a complex information space to find the required information is often difficult, partly because it is hard to design systems that present the user with an optimal set of navigation options supporting varying information needs. As a solution, we proposed a method, Interaction Portfolio Theory, based on Markowitz's Modern Portfolio Theory from finance. It provides the user with the N optimal interaction options in each iteration, taking into account the user's goal as expressed through interaction during the task, as well as the risk of a potentially suboptimal choice by the user. In each iteration, the method learns the relevant interaction options from user behaviour and optimizes relevance and diversity, allowing the user to accomplish the task in a shorter interaction sequence. I was funded by the MindSee EU project.
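The trade-off at the heart of a Markowitz-style formulation can be sketched as follows; the symbols here are illustrative, not the paper's exact notation:

```latex
% Choose weights w over candidate interaction options, trading off
% expected relevance \mu against risk (covariance \Sigma of outcomes),
% with \lambda controlling risk aversion:
w^{*} = \arg\max_{w} \; w^{\top}\mu - \lambda\, w^{\top}\Sigma\, w
\quad \text{s.t.} \quad \sum_{i} w_i = 1,\; w_i \ge 0
```

Under this reading, presenting a diverse set of options corresponds to lowering the portfolio's risk term while keeping expected relevance high.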

[Symbiotic (2014) Paper]
Go to top

Seismic reflection: automated seismic processing development

Seismology logo

I was fortunate to spend Summer 2013 as a research assistant at the Institute of Seismology, University of Helsinki, under the supervision of Dr. Timo Tiira (Seismologist). I developed and investigated methods for the automatic identification of similar structures at different scales, using pattern recognition, self-learning systems, and image-processing techniques. The existing software was written in ANSI C; I redesigned it using the OpenCV image-processing library. My work was jointly funded by the Department of Computer Science and the Institute of Seismology, University of Helsinki.

Go to top

Academic projects

Data Analysis Challenge: Information Visualization

TeraGrid logo

In this project, we solved a cyber-security threat mystery and visualized the final analysis. Our task was to identify the five W's: who were the malicious actor(s), what did they do, when and where did they do it, and, if possible, why did they do it? We used visualization to explore the data and form initial hypotheses. One interesting challenge was connecting the different data sets.

Team Members: Payel Bandyopadhyay, Min Oh, Anika Tabassum
Grade: A/A
Course Mentor: [Dr. Chris North]
[report] |
Go to top

Comparison of Absolute and Relative Pointing Effectiveness using Leap Motion

TeraGrid logo

This project presents an application that can be used as a research tool to compare the ease of use of absolute and relative pointing with the Leap Motion Controller. It captures quantifiable data to assess the effectiveness of the two pointing techniques. The application consists of four colored circles and four target circles; the user selects the colored circles one at a time and drags each to its respective target circle, performing the same task in both pointing modes. To compare ease of use, we display the time a user takes to complete the whole task in each pointing mode.
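The per-mode timing comparison can be summarized as in this short Python sketch; the helper name and trial format are hypothetical, standing in for the application's internal logging:

```python
from statistics import mean

def compare_pointing_modes(trials: list[dict]) -> dict[str, float]:
    """Mean task-completion time (seconds) per pointing mode.

    Each trial is a record such as {"mode": "absolute", "seconds": 12.4}.
    """
    by_mode: dict[str, list[float]] = {}
    for t in trials:
        by_mode.setdefault(t["mode"], []).append(t["seconds"])
    # Average completion time per mode, e.g. {"absolute": ..., "relative": ...}
    return {mode: mean(times) for mode, times in by_mode.items()}
```

Displaying the two means side by side gives a simple first-order comparison of the pointing techniques.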

Team Members: Payel Bandyopadhyay, Chris Blythe, Afaque Hussain, Farbod Faghihi
Grade: 5/5
Course Mentor: [Dr. Antti Jylha]
[report] | [presentation]
Go to top

Rubber hand illusion in mixed reality environment

Interactive logo

This project develops a state-of-the-art interactive system spanning interaction paradigms from desktop to mobile to ubiquitous computing. It presents an application that can be used as a research tool to investigate the impact of combining virtual reality with information from the real world in a mixed reality environment, simulating the traditional rubber hand illusion. In this environment, the user sees a fake hand in a virtual setting (rendered with an Oculus Rift), and one of the user's real hands is used to fool the brain: the virtual fake hand plays the role of the hand that remains behind the cardboard in the traditional rubber hand illusion, so that the user comes to believe the fake hand is his or her own. The user's viewpoint and observed stress response to the threat in the illusion were measured with post-experiment questionnaires and analysed via video recordings of the experiment.

Team Members: Payel Bandyopadhyay
Grade: 5/5
Course Mentor: [Dr. Eve Hoggan]
[report] | [presentation]
Go to top

Analysis of Four Usability Evaluation Methods Applied to Augmented Reality Applications

Augment logo

The way users interact with computers and mobile devices is changing drastically with new emerging technologies. Augmented reality (AR) is one such technology, defining a new way of user interaction. A large amount of research has evaluated user interfaces of traditional systems such as mobile devices and web interfaces, but since AR is still emerging, relatively few systematic evaluations of AR interfaces exist. In this project, a systematic evaluation of the user interface of an AR application was carried out. Four existing usability evaluation methods were chosen as guidelines; to cover all aspects of usability evaluation, these comprised two usability-inspection methods (cognitive walkthrough and heuristic evaluation), one usability-testing method (laboratory observation), and one user-report method (questionnaire). The AR application evaluated was "Augment - 3D Augmented Reality". The results obtained from the four methods are described in the report (see the link below). Because applying all methods to evaluate a user interface is usually infeasible given limited time and resources, the results of the four methods were compared, yielding a justification, based on the results obtained, of which usability evaluation method is most suitable for AR interfaces. Finally, a set of design guidelines for AR applications was proposed.

Team Members: Payel Bandyopadhyay, Hector Martinez (PhD: 30 Jan 2015)
Grade: 5/5
Achievement: Scored highest marks

Course Mentor: [Dr. Elisa Schaeffer (Professor)]
[report] | [presentation]
Go to top

Mockup app based on sensor data (map of friends' positions)

Android logo

This project presents a client-server mobile application that tracks the location of a mobile device and shows its sensor readings on Google Maps. The information is fetched from the CoAP server over HTTP; the jCoAP library was used because it already includes a simple in-memory database and an HTTP server. The application needs to be installed on two or more Android devices: each client sends its sensor data and location information to the server, and the server pushes the registered devices' locations to all client devices, so that a client device can monitor the other clients' movements on Google Maps.

Team Members: Payel Bandyopadhyay
Grade: 5/5
Course Mentor: [Dr. Sasu Tarkoma (Professor)]
Go to top

Understanding the basic concepts of human computer interaction

Hci logo

This project provides a practical implementation of the principles of human-computer interaction, covering the key concepts of the basic modalities from psychological, ergonomic, and technical points of view. The end goal was a journey-planner paper prototype for older people (around age 65 and above) who are not well versed in using mobile or desktop devices. Real-life interviews were conducted with senior users to understand their needs. The interface design of the final paper prototype applies methods and principles for graphical user interfaces, from direct manipulation, menus, and navigation up to multimodal interfaces. User modeling was done through cognitive, experiential, and social models of users, together with task analysis, culminating in computational models of users to be included in interactive systems.

Team Members: Payel Bandyopadhyay
Grade: 4/5
Course Mentor: [Dr. Giulio Jacucci (Professor)]
[report1] | [report2]
Go to top

Talks

  • Navigating Complex Information Spaces: A Portfolio Theory Approach
    Payel Bandyopadhyay
    Jan 29, 2015 | HCI Seminar, Aalto University, Finland.
  • Go to top

Awards & Honours

  • Full scholarship to attend the CRA-W Grad Cohort workshop (held in Chicago, IL), April 2019
  • Full scholarship to attend the CRA-W Grad Cohort workshop (held in San Francisco, CA), April 2018
  • Full scholarship to attend the Grace Hopper Conference (held in Houston, TX), August 2016
  • Gouranga Das Overseas Travel Award, August 2012
  • Gold Medal for ranking 1st out of 60 students in the undergraduate department, December 2012

  • Go to top
Teaching Experience

  • CS 5254  Mobile Application Development,   Virginia Tech, Blacksburg, USA
    Teaching Assistant
    Spring 2018, 2020
  • CS 4114  Introduction to Formal Languages and Automata Theory,   Virginia Tech, Blacksburg, USA
    Teaching Assistant
    Spring 2019, Fall 2019
  • CS 3744  Intro GUI Programming/Graphics,   Virginia Tech, Blacksburg, USA
    Teaching Assistant
    Fall 2017
  • 57596   Guidance tutoring,   University of Helsinki
    Teaching Assistant
    Fall 2013, Spring 2014, Fall 2014
  • International Tutor,   University of Helsinki
    Fall 2013
  • Go to top


  • I am a trained classical dancer. I have completed the fifth year in Kathak (a classical dance form of India) with first class distinction from Bangiya Sangeet Parishad.
    [pic1] | [pic2]
  • I am very passionate about painting. I have completed the sixth year in painting with first class distinction (specializing in oil colors) from Bangiya Sangeet Parishad.
  • I am also a trained swimmer. I love swimming on weekends.
  • I have won several competitions during my school years.
  • Go to top