When you consider that our world creates 1.7 million billion bytes of data every minute, roughly the equivalent of 360,000 DVDs, you can understand why many people might have trouble wrapping their heads around big data. However, thanks to the Collective Experience of Empathic Data Systems (CEEDs) project, just about anyone can “step inside” a large data set through virtual reality.

As featured by Forbes, CEEDs is a European consortium of 16 partners in nine countries that has developed a way to present big data virtually, in more relatable, human terms. Using an immersive multi-modal environment called the eXperience Induction Machine (XIM), CEEDs scientists at Pompeu Fabra University in Barcelona can now let users step inside large data sets whose scope is conveyed not in raw numbers, but in accessible, graphical terms.

Wearing a virtual reality headset inside the XIM, a user can view data sets presented in various shapes and forms. The difference, however, is that the CEEDs user will see the presentation change based on their reactions, whether conscious or unconscious.

To sense these reactions, the XIM uses a variety of wearable technology devices: motion sensors that track posture and body movement; a glove that senses hand movements, grip and skin response; voice equipment that detects emotion in the user’s statements or utterances; and facial analysis that monitors expression, pupil dilation and more. By monitoring all of the user’s reactions to the data presentation, the XIM can sense fatigue or information overload and simplify the visualization to lighten the cognitive load, or simply steer the user toward less data-intensive areas.
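The adaptive loop described above can be sketched in code. This is a minimal illustration only, not the XIM's actual implementation: the sensor channels, weights, and thresholds below are all hypothetical, chosen to show how several normalized sensor readings might be combined into a single load estimate that raises or lowers the visualization's level of detail.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """One snapshot of (hypothetical) wearable-sensor readings, normalized to 0..1."""
    posture_tension: float   # from motion sensors
    grip_pressure: float     # from the sensing glove
    voice_stress: float      # from voice analysis
    pupil_dilation: float    # from facial/eye analysis

def cognitive_load(frame: SensorFrame) -> float:
    """Combine channels into one load estimate (a simple weighted average)."""
    weights = {
        "posture_tension": 0.2,
        "grip_pressure": 0.2,
        "voice_stress": 0.3,
        "pupil_dilation": 0.3,
    }
    return sum(getattr(frame, name) * w for name, w in weights.items())

def adapt_detail(current_detail: int, frame: SensorFrame,
                 overload_threshold: float = 0.7,
                 comfort_threshold: float = 0.3) -> int:
    """Lower the visualization's detail level on overload; raise it when relaxed."""
    load = cognitive_load(frame)
    if load > overload_threshold:
        return max(1, current_detail - 1)   # simplify the scene
    if load < comfort_threshold:
        return current_detail + 1           # the user can handle more data
    return current_detail
```

In a real system the load estimate would come from trained models rather than fixed weights, but the control structure, i.e. continuously sensing the user and adjusting the presentation, is the core idea.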

As CEEDs is a consortium of 16 partners, its early uses have covered a wide range of specialized fields. For example, researchers at Leiden University are using the XIM to enhance the virtual reconstruction of ancient Greek cities, while German scholars have been using the technology to virtually recreate the history of a Nazi concentration camp at the Bergen-Belsen memorial site. In addition, a number of museums in the Netherlands, the United Kingdom and the United States are considering using the XIM as part of their commemoration of the 70th anniversary of the end of World War II in 2015.

Looking forward, CEEDs researchers say the XIM has big potential anywhere there’s a big data set to be analyzed. That could range from creating a virtual retail store environment in an international airport to visualizing African soil quality and climate to help local farmers optimize crop yields.

Using virtual reality, a picture really can be worth a thousand words. And, thanks to the CEEDs project, now anyone can see the big picture on big data.

Photo and video credit: CEEDs
