Shakespeare’s Hunt (Video)

Shakespeare’s Hunt is a pilot of an alternate reality (treasure hunt) iPad game for the exhibition at Shakespeare’s Globe Theatre. I worked on the project as a partner of Metavore London, developing the software, designing the interface and contributing to the game design. My contribution to this project is also my thesis for the second year of my MFA studies.

At the beginning of the last academic year I set a personal goal: for my final project I would work closely with a cultural institution and develop a piece of work that would serve a purpose set by the professionals who work there. Given that last year my prototype applications were designed to fulfil needs I had identified only through my own research, I received feedback that some of my assumptions do not hold for the majority of cultural heritage institutions. In particular, during a conference earlier in the year I had the chance to talk with Nancy Proctor, head of mobile strategy for the Smithsonian Institution, and, commenting on my work, she made the point that some of the material ARS:CI aims to take advantage of can be provided only by the small minority of museums that can afford it, e.g. under-drawings of paintings or 3D reconstructions of damaged sculptures.

For Shakespeare’s Hunt we had the chance to work closely with people from Shakespeare’s Globe, so it was a great pleasure to have their insight and to let the design phase of the project be shaped by the objectives the Globe itself pursues in matters such as the use of new technologies in exhibition spaces and in education.

For my studies I have been writing a paper about this project, which I am planning to submit to a few conferences on the topic and hopefully get published. I am still working on it; the synopsis as of today is the following:

Museums’ missions are stated as providing education through the exhibition and interpretation of their collections [1], yet for the vast majority of cultural heritage institutions the interpretative material is static, consisting of long pieces of textual or audio information. According to Georgina Goodlander, Interpretive Programs Manager for the Smithsonian Art Museum, “The twenty-first-century audience has an increasingly short attention span, extremely high expectations when it comes to finding and engaging with information, the ability to communicate with friends and strangers quickly and on multiple platforms, and a very open approach to learning”. [2] This paper argues that as audiences evolve, so should exhibition spaces; Shakespeare’s Hunt is an attempt to exploit the benefits emerging technologies have to offer and to provide an experience designed for the audience of our era, as described by Goodlander above.

MA Dissertation (A+)

My thesis for the MA Interactive Digital Media, apart from the deployment of the Augmented Reality Suite presented in the previous post, also includes in-depth research into augmented reality and the way it has been applied in the cultural heritage sector.

The abstract:

This is a report on the project The Augmented Reality Suite for Cultural Institutions [ARS:CI] and its accompanying research. ARS:CI is a software suite comprising three Apple iPhone applications that take advantage of Augmented Reality to enhance the learning experience in the museum context. The potential of Augmented Reality for cultural heritage institutions has been the subject of research for over a decade; however, ARS:CI is one of the first projects aiming to develop an Augmented Reality solution for museums and galleries that does not rely on custom-made or expensive hardware, such as Ultra-PCs, but instead takes advantage of the smartphones already on the market. The research methods followed were mainly desk research and interviews with professionals from the cultural heritage sector. During the software development process, user testing was employed to gather feedback, combined with desk research on techniques that would optimise the performance of the applications. The study showed that despite the large amount of research in the area of augmented reality for museums, only a minority of sustainable solutions have eventually been adopted for permanent exhibitions, mainly because of the cost of purchasing and maintaining the Augmented Reality systems designed. The feedback ARS:CI received was positive, and professionals with a background in the cultural heritage sector, as well as Augmented Reality specialists, saw in it great potential as a tool able to transform the learning process into a compelling experience.

The introduction of the document can be found below. If you are interested in the full document, please e-mail me. In addition, since the subject deals with new media, content such as videos is essential. For that reason, wherever the icon on the right appears in the text, it indicates that there is a relevant video available in the Video Reel below.

Video Reel

The videos accompanying the dissertation can be found below. They appear in the order in which they are mentioned in the text, i.e. “The ARS:CI”, “The Virtual Dig”, “An Augmented Reality Museum Guide” and “Mixed Reality for the Natural History Museum in Japan”.

The ARS:CI
The Virtual Dig
The Room
A Magical Start
Brushing
Examining Tusks
Removing Tusks
Examining Masks
Finding Statue
Combining Statue
Fitting Objects
An Augmented Reality Museum Guide
Mixed Reality for the Natural History Museum in Japan
Kondo

When I joined this master’s course, all I knew was that I liked new media in general; a year later, I find myself fully aware of the subject of my interest. A zillion thanks to all the people, related or unrelated to Ravensbourne, who helped me throughout the last academic year.

MA Thesis (Video)

During the last academic year, in the MA Interactive Digital Media course at Ravensbourne College, I discovered my interest in innovative applications for museums, while at the same time I had the chance to explore a plethora of new technologies. After long hours of research as well as practice (i.e. programming, design and user testing), I found my exact subject of interest, Augmented Reality iPhone applications for museums, and have dedicated myself to it ever since. The video below demonstrates the largest part of my work.

For the installation of my project at the MA Major Project Exhibition, which took place on 12–16 July at Ravensbourne College, I designed the following poster, which summarises the concept and the functionality behind the prototype applications of the Augmented Reality Suite.

Now I am working on my dissertation, which documents my major project and the research that supported its concept and implementation. The approximately 10,000-word essay will be uploaded here by September.

Supernatural 3D Chess

Update: Supernatural 3D Chess has now been added to Softpedia’s database of games and can be found here!

A fully functional human-versus-human 3D chess game was my project for the Virtual Reality class of my BSc course in Computer Science. Supernatural 3D Chess is implemented with Microsoft XNA 3.0 and is available on SourceForge, where it has been downloaded about 100 times per month for almost a year now. A while ago a member of the SourceForge community, an XNA developer, contributed to Supernatural 3D Chess by improving some existing features. Since the Virtual Reality class was in the last year of my studies, I didn’t have the chance to take the game further by adding artificial intelligence, or networking so that users could play over the internet. That’s also the main reason for uploading it to SourceForge; whenever I come back to XNA development, it’s the first project I will try to improve.

During the game, the user clicks on a piece and all its available moves on the chessboard are automatically highlighted. The user then clicks on a square to make a move, and if the move is not allowed by the rules of chess, an appropriate message appears in the upper-left corner of the screen. If the user wants to arrange the pieces on the chessboard in a specific way, or wants to perform castling or en passant, there is a God mode feature that can easily be turned on and off. When a pawn reaches the other end of the chessboard and God mode is off, the user is asked to select the piece to which the pawn will be promoted.
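To give a flavour of the move-highlighting logic, here is a minimal sketch of the idea for a single piece type; the names (Piece, Square, MoveHighlighter) are illustrative and not taken from the project’s source code.

```csharp
// Minimal sketch: collect the squares a rook can reach so they can be highlighted
// when the player clicks it. The board is an 8x8 array where null means an empty square.
using System.Collections.Generic;

enum Side { White, Black }

class Piece
{
    public Side Side;
}

struct Square
{
    public int Row, Col;
    public Square(int row, int col) { Row = row; Col = col; }
}

static class MoveHighlighter
{
    public static List<Square> RookMoves(Piece[,] board, int row, int col, Side side)
    {
        var moves = new List<Square>();
        int[,] directions = { { 1, 0 }, { -1, 0 }, { 0, 1 }, { 0, -1 } };

        for (int d = 0; d < 4; d++)
        {
            int r = row + directions[d, 0];
            int c = col + directions[d, 1];

            // Slide until the edge of the board or another piece is reached.
            while (r >= 0 && r < 8 && c >= 0 && c < 8)
            {
                if (board[r, c] == null)
                {
                    moves.Add(new Square(r, c));        // empty square, keep sliding
                }
                else
                {
                    if (board[r, c].Side != side)
                        moves.Add(new Square(r, c));    // enemy piece can be captured
                    break;                              // blocked either way
                }
                r += directions[d, 0];
                c += directions[d, 1];
            }
        }
        return moves;
    }
}
```

The returned squares are the ones that would be highlighted on the chessboard; the other piece types follow the same pattern with their own movement rules.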

In terms of development, although I had previously studied 3D graphics (in the homonymous class in the third year of my undergraduate studies) and transformation matrices, it took me a while to feel confident with them. As for the rules of chess, highlighting the available moves while taking into account the positions of all the pieces on the board was the most interesting part for me.
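For readers unfamiliar with XNA, the sketch below shows the kind of world/view/projection setup this refers to; the model, its position and the camera values are illustrative assumptions, not the project’s actual code.

```csharp
// Minimal XNA sketch: render one chess-piece model with world, view and projection matrices.
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

public class ChessGame : Game
{
    GraphicsDeviceManager graphics;
    Model pieceModel;                                  // loaded in LoadContent via Content.Load<Model>(...)
    Vector3 piecePosition = new Vector3(2f, 0f, 3f);   // where the piece sits on the board

    public ChessGame()
    {
        graphics = new GraphicsDeviceManager(this);
    }

    protected override void Draw(GameTime gameTime)
    {
        GraphicsDevice.Clear(Color.Black);

        // World: places the piece on the board.
        Matrix world = Matrix.CreateTranslation(piecePosition);

        // View: a camera above and behind the board, looking at its centre.
        Matrix view = Matrix.CreateLookAt(new Vector3(0f, 12f, 14f), Vector3.Zero, Vector3.Up);

        // Projection: perspective projection matching the window's aspect ratio.
        Matrix projection = Matrix.CreatePerspectiveFieldOfView(
            MathHelper.PiOver4, GraphicsDevice.Viewport.AspectRatio, 1f, 100f);

        foreach (ModelMesh mesh in pieceModel.Meshes)
        {
            foreach (BasicEffect effect in mesh.Effects)
            {
                effect.EnableDefaultLighting();
                effect.World = world;
                effect.View = view;
                effect.Projection = projection;
            }
            mesh.Draw();
        }

        base.Draw(gameTime);
    }
}
```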

The source code is available to download on SourceForge; please click the image above to reach the project’s page. If you are a developer and would like to contribute, that would be excellent. Enjoy 3D graphics with the .NET Framework!

BSc Thesis (Video)

My dissertation for the BSc (Hons) in Computer Science at the Informatics department of the University of Piraeus is a Silverlight application for the artistic archive of the National Theatre of Northern Greece; a video demonstration is available on Vimeo. Microsoft Silverlight is one of my favourite technologies, since it embraces the designer/developer combination that clearly defines me.

The application is used as an interface to the theatre’s 13 GB database. All the controls included in the application (accordion timeline, sequential fade-in/out menus, thumbnail view etc.) were written from scratch. I learnt a lot during this project and it definitely made me a better developer. I would attach my documentation if it weren’t in Greek, but if you do speak Greek, please contact me.
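As a rough illustration of the kind of animation behind the sequential fade-in/out menus, the sketch below (illustrative, not the theatre application’s code) uses a Storyboard to animate a control’s Opacity in Silverlight code-behind.

```csharp
// Minimal sketch: fade a Silverlight UIElement in by animating its Opacity from 0 to 1.
using System;
using System.Windows;
using System.Windows.Media.Animation;

public static class FadeHelper
{
    public static void FadeIn(UIElement element, double seconds)
    {
        var animation = new DoubleAnimation
        {
            From = 0.0,
            To = 1.0,
            Duration = new Duration(TimeSpan.FromSeconds(seconds))
        };

        var storyboard = new Storyboard();
        storyboard.Children.Add(animation);
        Storyboard.SetTarget(animation, element);
        Storyboard.SetTargetProperty(animation, new PropertyPath("Opacity"));
        storyboard.Begin();
    }
}
```

A menu item would then appear with a call such as FadeHelper.FadeIn(menuItem, 0.4), and a symmetric FadeOut method animating the Opacity back to zero would hide it again.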

Rave in my head (Video)

In the first term of my postgraduate studies at Ravensbourne I made an installation to experiment with a wide variety of technologies, from MAX Jitter to non-electronic materials like wooden structures! It was an attempt to create a multi-touch surface of water, and to my surprise it partially worked. During this project I had the chance to experiment with video montage and video effects, and as a result I made the following video, available on Vimeo.

I used Adobe Premiere Pro as well as Adobe After Effects, and I captured the footage with my Canon IXUS camera while cycling! That’s the main reason I chose to make this video an abstract, psychedelic one – there was no proper footage! For a first experiment in video production, though, I must admit I’m OK with it!

Gclass

In the last term of my undergraduate studies at the University of Piraeus, I had one class on learning systems and one on Geospatial Information Systems. My tutors agreed that they would both accept the same project, a combination of these two areas. So I built Gclass, a geography tutoring system using Microsoft Silverlight and the Live Maps (now Bing Maps) API. Thankfully, my working prototype got 10/10 in both classes, and also earned me a very good recommendation letter for my MA studies, written by my GIS tutor.

The screen was divided into two main parts: a Silverlight application on the left-hand side and a Live Maps control on the right-hand side. The first feature of Gclass let the user navigate through lessons paragraph by paragraph; each time he/she moved on to the next paragraph, the map on the right-hand side would change position and angle (the user was prompted to switch the map to 3D view), showing a place on earth relevant to that paragraph. The indicative lesson displayed in the prototype was about rivers, so at the paragraph explaining river deltas the earth would turn and zoom to the Nile’s delta. Furthermore, at the paragraph about the Amazon river, a polygon of its shape would be overlaid on the map, highlighting the Amazon’s position and size. The appropriate caption for each view appeared at the bottom of the Silverlight application.

The second main feature of Gclass turned the Live Maps control into a 2D world map and, in the Silverlight application, prompted the user to draw an area of the world map that he/she would like to explore. Afterwards, the countries included in or intersected by the polygon the user drew were listed in the Silverlight application, along with a link to their Wikipedia pages. When the user selected a country in the Silverlight application, the Live Maps control would zoom in and focus on it. To that part of Gclass I would like to add many more GIS features, but the only data I found freely available online, after hours and hours of searching, were the polygons of the countries of the world.

Regarding technology, the Silverlight application communicated with the Live Maps control via JavaScript. The polygons of the rivers and of the countries of the world were stored in a Microsoft SQL database, as the whole site was an ASP.NET website. The Silverlight application would call a SOAP service hosted by the ASP.NET website, which would then retrieve the appropriate data from the SQL database and return it to the Silverlight application. The GIS functionality, e.g. the intersection between the user-drawn polygon and the countries’ polygons stored in the database, was implemented in SQL procedures.
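As a sketch of how such a bridge can look on the Silverlight side (with assumed names; the actual Gclass code may differ), the hosting page is assumed to expose a JavaScript function setMapView(latitude, longitude, zoom) that wraps the Live Maps control:

```csharp
// Minimal sketch: calling a JavaScript function on the hosting page from Silverlight.
using System.Windows.Browser;

public static class MapBridge
{
    // Called from the lesson navigation code whenever the map should move to a new place.
    public static void ShowPlace(double latitude, double longitude, int zoomLevel)
    {
        HtmlPage.Window.Invoke("setMapView", latitude, longitude, zoomLevel);
    }
}
```

On the HTML page, setMapView would in turn use the Live Maps JavaScript API to pan, tilt and zoom the map to the requested location.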

I would be glad to share the source code of Gclass with anyone interested; just send me an e-mail.