Masterclass – Multimodal Engagements with Cultural Heritage

The research institute in the Humanities at Maynooth University has organised a Masterclass on ‘Multimodal Engagements with Cultural Heritage’. This three-day Masterclass is designed to introduce participants to methods for producing and re-using digital cultural heritage. First, participants will learn how to convert physical objects into digital models and back into physical ones through 3D printing techniques. Second, we will re-embed some of those physical objects with digital information.


I will be leading the second part, where we will design a Tangible User Interface to query data from Europeana. We will be using physical objects, embedding them with interactive properties so they can perform queries against Europeana’s repositories.

 

The interactive system that we are going to build combines a range of technologies: TUIO/reacTIVision to connect the physical objects to the computer, and jQuery and JavaScript to turn their movements into queries against Europeana’s API.
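To give a flavour of what that looks like in code, here is a minimal sketch of a Europeana search call made with jQuery. The endpoint and parameter names follow Europeana’s v2 search.json API; YOUR_API_KEY and the #results element are placeholders for your own key and markup.

```javascript
// Minimal sketch: query Europeana's Search API with jQuery and list the titles.
// "YOUR_API_KEY" and "#results" are placeholders for your own key and markup.
var apiKey = "YOUR_API_KEY";

function searchEuropeana(term) {
  $.getJSON("https://www.europeana.eu/api/v2/search.json", {
    wskey: apiKey, // your Europeana API key
    query: term,   // the search term produced by the tangible object
    rows: 12       // number of items to fetch
  }, function (data) {
    $("#results").empty();
    $.each(data.items || [], function (i, item) {
      // item.title is an array of titles in the v2 response
      $("#results").append("<li>" + (item.title ? item.title[0] : "Untitled") + "</li>");
    });
  });
}

// e.g. searchEuropeana("illuminated manuscript");
```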

TUIO Table

You can see the interface working on my YouTube channel.

 

Learning Objectives

Day 1. The Semantic Web and Linked Data

The first part of the Masterclass will introduce basic concepts of how current Web technologies, such as the Semantic Web, are being used to enhance the quality of information in cultural heritage organisations.


Part 1. Foundations of Semantic Web and the Europeana Data Model

Part 2. Europeana API

Day 2. Tangible Interaction on the Web

Europeana’s data is very complex and extensive. By understanding how the data model uses different semantic concepts to conceptualise the information, we can use those semantic relationships and data fields to query and visualise the collections according to our users’ needs.
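As an illustration of how those fields can drive a query, here is a small sketch that narrows a search by one of the semantic fields. The field names and the qf refinement parameter follow the Europeana Search API conventions; visualiseItems is a hypothetical function standing in for our own visualisation code.

```javascript
// Sketch: use a semantic field from the data model to narrow the query.
// "visualiseItems" is a placeholder for whatever visualisation we build.
function searchByField(field, value) {
  $.getJSON("https://www.europeana.eu/api/v2/search.json", {
    wskey: apiKey,
    query: field + ':"' + value + '"', // e.g. who:"Rembrandt" or what:"map"
    qf: "TYPE:IMAGE",                  // refine the results to items with images
    rows: 12
  }, function (data) {
    visualiseItems(data.items || []);
  });
}
```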


Part 3. Visualising Europeana Data

Extending Visualisations with jQuery UI

Part 4. Tangible Interaction on the Web

 

Through this Masterclass, we will also work with participatory design principles to explore what particular behaviours users might have when approaching this type of content. The main objective of this second part of the Masterclass is to re-think how we can interact with Cultural Heritage on the Web and how those interactions might take place.


Europeana TUIO – Final Build!

We have been building the web application with a range of services and libraries. We used jQuery and jQuery UI to change the way the different HTML objects react and look on the interface, and the TUIO protocol to translate the data sent from the sensor (in this case, a webcam) into messages that JavaScript can use. To build the library of actions, we used nptTuioClient and its browser plugin, and attached several functions that decide what the interface is meant to do whenever a fiducial enters, moves or leaves the active area.
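As a rough sketch of that library of actions, the handlers below show the enter/move/leave pattern. The function names and the way they would be wired to the TUIO client are illustrative placeholders, not the exact nptTuioClient API.

```javascript
// Illustrative enter/move/leave handlers for fiducials. The names and wiring
// are placeholders; the actual nptTuioClient callbacks may differ.
var activeFiducials = {};

function onFiducialAdd(symbolID, x, y, angle) {
  activeFiducials[symbolID] = { x: x, y: y, angle: angle };
  $("#mySearch").fadeIn(); // show the search box when the object is placed
}

function onFiducialUpdate(symbolID, x, y, angle) {
  activeFiducials[symbolID] = { x: x, y: y, angle: angle };
  // TUIO positions are normalised (0..1), so scale them to the window size
  $("#mySearch").css({
    left: x * $(window).width(),
    top:  y * $(window).height()
  });
}

function onFiducialRemove(symbolID) {
  delete activeFiducials[symbolID];
  $("#mySearch").fadeOut(); // hide it again when the object is lifted
}
```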

The object currently used to search attaches the query syntax to the API call to Europeana; by rotating it, we can change the term of the particular dataField that we want to reference.
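A minimal sketch of that rotation behaviour might look like this; the list of data fields and the way the angle is split into sectors are illustrative.

```javascript
// Sketch: map the fiducial's rotation angle to the data field being referenced.
// The field list is illustrative; TUIO reports angles in radians (0..2π).
var dataFields = ["title", "who", "what", "where"];

function fieldFromAngle(angle) {
  var sector = Math.floor((angle / (2 * Math.PI)) * dataFields.length);
  return dataFields[Math.min(sector, dataFields.length - 1)];
}

// e.g. inside the update handler:
//   searchByField(fieldFromAngle(angle), $("#mySearch input").val());
```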

The fiducials being used are:



Using the printed fiducials.

Since we have been using the emulator, we haven’t had the chance to see how the interface reacts when we use the printed markers. Once we have tried them, we will finalise the fiducials (or pyfos) and turn them into isometric shapes, built as paper volumes, so that they are graspable and manipulable.

To do this, we need to use the reacTIVision toolkit and its fiducial tracker instead of the emulator. It can be downloaded here:

http://reactivision.sourceforge.net/#files

Depending on your operating system, you will have to download the corresponding package and use the reacTIVision application inside it.

In addition, reacTIVision provides a large number of fiducials, and each one has its own ID. Here is the link to the PDF file:

http://reactivision.sourceforge.net/data/fiducials.pdf
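Since each printed fiducial carries its own ID, a simple way to organise the interface is to map those IDs to roles; the IDs and roles below are placeholders for whichever markers you decide to print.

```javascript
// Illustrative mapping from fiducial IDs to the roles they play on the table.
// Use the IDs of the markers you actually print.
var fiducialRoles = {
  0: "search",    // the object that carries the query
  1: "dataField", // rotating this one changes the field being referenced
  2: "reset"      // clears the current results
};

function roleFor(symbolID) {
  return fiducialRoles[symbolID] || "unused";
}
```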



Running reacTIVision

When you open reacTIVision, the software should automatically recognise the video device currently available on your computer. However, you might have more than one device plugged in and want to point reacTIVision to a specific one.

1. Open reacTIVision

If your camera is detected and you can see the display, skip to the next section.


If your camera was not detected, reacTIVision will give you a notification and close.


To fix this, follow these steps:

1.a. Open the reacTIVision folder and then the calibration folder. Inside it, run the list_devices application. Once opened, it will give you a list of the available devices and the number that identifies each one. REMEMBER THIS NUMBER!

In this example, the camera to be used is number 2 (USB Camera).


1.b. Close the list_devices app by clicking the OK button.

1.c. Go back to the reacTIVision folder, right-click the reacTIVision app and select Show Package Contents.


This will open a new file browser window.

1.d. Open the Resources folder and then the file camera.xml.

1.e. In camera.xml, change <camera id="Number"> so that the id matches the number your camera was listed with; in this example it becomes <camera id="2">.


1.f. Save the file, close the file explorer and open the reacTIVision application once again.


Viewing the final result

You should see the objects on the interface reacting in the same way as they did when we used the emulator.


If you move the pyfo and the #mySearch box moves to the opposite side, press i to flip the x or y axis in reacTIVision so that it matches the orientation of the browser.
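If you prefer to correct it in the browser instead of flipping the axis in reacTIVision, a small sketch like the one below (with hypothetical invertX/invertY flags) does the same job.

```javascript
// Sketch: optionally mirror the normalised TUIO coordinates before positioning
// the #mySearch box, instead of flipping the axis in reacTIVision.
var invertX = true;  // set these to match your camera orientation
var invertY = false;

function positionSearchBox(x, y) {
  var nx = invertX ? 1 - x : x;
  var ny = invertY ? 1 - y : y;
  $("#mySearch").css({
    left: nx * $(window).width(),
    top:  ny * $(window).height()
  });
}
```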


2. Press the corresponding key to see the different options.

If the camera is having trouble with the light or detecting the markers, we can open the camera options to fix it.

3. Press the corresponding key to open the camera options and change the calibration.



Congratulations!

Here is the final prototype working! Don’t forget to share and promote Tangible User Interfaces! 🙂