A while back, I visited my parents to replace their digital cable setup with an IPTV setup. This had some great advantages, as the set-top boxes only needed network cables instead of coax cables. I used power line communication so that I didn’t have to lay any cable. In the end, every TV set in the house was able to watch high-definition television, and every iOS and Android device was capable of watching standard-definition television.
The provider (KPN) has an interesting approach to recording television. When you record a certain program, whether it’s currently on or in the future, you don’t record anything on the set-top box. It is recorded on their servers, and if you want to play it back, you can stream it from their servers to your set-top box. This complicated process is all brilliantly hidden from the user. The advantage of such a system is that a recording on one device can be played back on all devices in the house. Every set-top box can display a list of recorded programs that is retrieved from the server. Another advantage is that you can start a recording from the provider’s website or their mobile application, even when you’re not at home. And you don’t need to have your set-top box running 24/7, which saves power.
But there was one rather large issue with scheduling a recording…
As a continuation of the research I did on automated gameplay with Pong and Doeo, I decided to try to play Dance Dance Revolution using a neural network. The project had two research questions: Is it possible to use a neural network to play the game Dance Dance Revolution? And can we do it without having (full) knowledge of the game?
As a small exercise in augmented virtuality, I developed a setup where it was possible to use a real-life candle to light up a virtual object. By placing a WiiMote above a computer monitor, it was possible to track candles (or other infrared sources) placed around the monitor. These light sources were then mapped into a virtual 3D space (using OpenGL) to shine a light on a virtual object displayed on the computer monitor. A video showing this effect in action is posted below. The full description and the software source code can be found on the project page.
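The core of the mapping step can be sketched as follows. This is a minimal Python sketch, not the project's actual code: it assumes the WiiMote reports infrared blob positions in its native 1024×768 camera resolution, and that the virtual scene places the monitor plane at z = 0 with the light floating in front of it. All names and scale factors are illustrative.

```python
# Map a WiiMote IR blob (native camera coordinates, 1024x768) to a
# light position in a virtual 3D scene whose monitor plane sits at
# z = 0. Scene dimensions and the light depth are assumptions.

def ir_to_light_position(ir_x, ir_y, scene_width=2.0, scene_height=1.5, light_z=1.0):
    """Convert raw IR camera coordinates to scene coordinates.

    ir_x: 0..1023 (left to right as seen by the WiiMote)
    ir_y: 0..767  (top to bottom)
    Returns an (x, y, z) tuple usable as an OpenGL light position.
    """
    # Normalize to -1..1, flipping y because camera y grows downward.
    nx = (ir_x / 1023.0) * 2.0 - 1.0
    ny = 1.0 - (ir_y / 767.0) * 2.0
    # Scale to scene units and place the light in front of the monitor.
    return (nx * scene_width / 2.0, ny * scene_height / 2.0, light_z)

# A blob in the center of the camera image maps to the scene origin.
print(ir_to_light_position(511.5, 383.5))  # → (0.0, 0.0, 1.0)
```

The resulting tuple would then be fed to OpenGL as a positional light each frame, so a candle moved around the monitor moves the virtual light with it.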
My project paid tribute to GeoCities. The inspiration for the project comes from the GeoCities archive from the Archiveteam. Whilst browsing the nostalgic pages, I couldn’t help but notice how these pages were mainly random pieces of text and images cobbled together. Which is why I asked myself:
Is it possible to randomly generate a GeoCities page?
The project scans the entire GeoCities archive looking for HTML pages. It then cuts the HTML pages up into small bite-sized pieces, which are stored in a database. When a user tries to access the project website, this database is queried and a completely random page is put together. The end result was a website that at first glance really seemed like a random GeoCities page, but actually wasn’t.
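The cut-and-reassemble pipeline could look roughly like this. A minimal sketch using an in-memory SQLite database; the project's actual cutting rules aren't described here, so splitting fragments at the closing tags of a few common block elements is purely an illustrative assumption.

```python
import random  # noqa: F401  (SQLite's RANDOM() does the shuffling here)
import re
import sqlite3

def cut_page(html):
    """Split HTML into fragments after common block-level closing tags.

    The choice of </p>, </table> and </center> as cut points is an
    illustrative assumption, not the project's real splitting rule.
    """
    pieces = re.split(r"(?<=</p>)|(?<=</table>)|(?<=</center>)", html,
                      flags=re.IGNORECASE)
    return [p.strip() for p in pieces if p.strip()]

def store_fragments(conn, html):
    """Cut one archived page and store its fragments in the database."""
    conn.executemany("INSERT INTO fragments(body) VALUES (?)",
                     [(p,) for p in cut_page(html)])

def random_page(conn, n=3):
    """Query n random fragments and glue them into a complete page."""
    rows = conn.execute(
        "SELECT body FROM fragments ORDER BY RANDOM() LIMIT ?", (n,)
    ).fetchall()
    return "<html><body>" + "\n".join(r[0] for r in rows) + "</body></html>"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fragments(id INTEGER PRIMARY KEY, body TEXT)")
store_fragments(conn, "<p>Welcome to my homepage!</p><center><img src=cat.gif></center>")
store_fragments(conn, "<p>Under construction...</p>")
print(random_page(conn, n=2))
```

Because every fragment is real markup from a real archived page, the assembled result keeps the authentic look while the combination itself is new on every request.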
To make everything feel genuine, the project was presented on a low-color and low-resolution CRT monitor with a virtual machine running Windows 98 Second Edition and Netscape Communicator 4.60 to make everything period-correct. There was also an algorithm that tried to guess the age of the HTML pages in the archive in order to eliminate newer pages from the building process.
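The post doesn't document how the age-guessing algorithm worked, but the general idea of scoring era-typical markup features can be sketched like this. The markers, weights, and threshold below are all illustrative assumptions.

```python
import re

# Markup typical of mid-90s hand-written pages vs. later pages.
# Both lists and the scoring rule are illustrative assumptions.
OLD_MARKERS = [r"<font\b", r"<center\b", r"<blink\b", r"<marquee\b",
               r"bgcolor=", r"<frameset\b"]
NEW_MARKERS = [r"<!doctype html>", r"<div\b", r'rel="stylesheet"',
               r"<script\b"]

def looks_vintage(html, threshold=1):
    """Return True if the page scores more 'old' than 'new' features."""
    html = html.lower()
    old = sum(bool(re.search(m, html)) for m in OLD_MARKERS)
    new = sum(bool(re.search(m, html)) for m in NEW_MARKERS)
    return (old - new) >= threshold
```

A filter like this would let the builder skip late-era pages so that only fragments with the classic GeoCities look end up in the database.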
Unfortunately, because of the size of the project, and because the hardware / software setup added a lot to the experience of browsing the pages, it is not available online. Screenshots are posted below to get a feel for the project. The full description and the software source code can be found on the project page.