Niklas Elmqvist is a Swedish-American computer scientist. He is currently a professor in the Department of Computer Science[1] at Aarhus University and a Villum Investigator.[2] He is the director of the Center for Anytime Anywhere Analytics[3] at Aarhus University, a research center on augmented and extended reality (AR/XR) for data visualization.
Previously, he was a professor in the College of Information Studies,[4] an affiliate professor in the Computer Science Department, and an affiliate member of UMIACS (University of Maryland Institute for Advanced Computer Studies), all at the University of Maryland, College Park.
Elmqvist served as director of the University of Maryland Human–Computer Interaction Lab from 2016 to 2021.[5]
Prior to joining UMD, he was a faculty member in the School of Electrical and Computer Engineering at Purdue University from 2008 to 2014.
In 2013, he received a U.S. National Science Foundation CAREER Award.
In 2018 he was named a Distinguished Scientist by the Association for Computing Machinery (ACM).[6]
In April 2023, he was named a Villum Investigator by the Villum Foundation in Denmark.[7]
In November 2023, he was named as an IEEE Fellow by the Institute of Electrical and Electronics Engineers.
Niklas Elmqvist is known for his work on human–computer interaction, data and information visualization, and visual analytics.
His contributions span a broad range of topics, with a focus on novel forms of human-data interaction and open data infrastructures.
Early work centered on multivariate data visualization, such as in the GraphDice[9] or DataMeadow[10] systems.
In his 2009 paper "Hierarchical Aggregation for Information Visualization",[11] he proposed a model for building, visualizing, and interacting with multiscale data representations, which has guided other researchers toward more scalable visualization techniques.
Niklas Elmqvist has developed new architectures that enable novel combinations of multiple devices (such as smartwatches and large displays) as well as thought-provoking new devices, such as the first olfactory data display.
He has championed the design of open and standardized infrastructures that mesh such devices into a coherent whole, producing a series of prototypes including Munin, PolyChrome, and, more recently, the Vistrates framework,[12] which supports building cross-device and distributed visualization applications.
His research on fluid interaction covers a diverse set of visualization topics, from graphs and time series to animation and games. More recently, he has explored sensory substitution mechanisms, such as haptic technology and sonification, to improve the accessibility of visualizations for users with visual impairments.[13]