BioCity’s Discovery Building with Corona display — 360 Video and Photographs

BioCity’s new Discovery building was unveiled a few days ago, with a unique solar installation titled Corona, designed by Nottingham artist Wolfgang Buttress in partnership with Nottingham Trent University physicist Dr Martin Bencsik. Fibre-optic lights and aluminium tubes use real-time solar data from NASA, creating a light display that is always unique.

My 360 video with spatial audio:

360 still images:

360 images of the BioCity Discovery Building

Spatial Audio for VR with a Ricoh Theta S Camera and Zoom H2n Audio Recorder

I had a try at adding spatial audio to a VR video. In theory this should add realism to a 360 VR video by adding audio that can be processed to play back differently depending on the direction the viewer is facing.

I updated the Zoom H2n to firmware version 2.00 as described at https://www.zoom.co.jp/H2n-update-v2 (this version adds the 4-channel Spatial Audio mode), and set it to record uncompressed WAV at 48 kHz, 16-bit.

I attached the audio recorder to my Ricoh Theta S camera. I orientated the camera so that the record button was facing toward me, and the Zoom H2n’s LCD display was facing away from me. I pressed record on the sound recorder and then on the video camera. I then needed an audio and visual cue to synchronize the two in post-production, and clicking my fingers worked perfectly.

I installed the ambiX plug-in suite (http://www.matthiaskronlachner.com/?p=2015). I created a new project in Adobe Premiere, and a new sequence with Audio Master set to Multichannel with 4 adaptive channels. Next I imported the audio and video tracks, and cut them to synchronize at the moment I clicked my fingers together.

Exporting was slightly more involved: I exported two files, one for video and one for audio.

For the video export, I used the following settings:

  • Format: H.264
  • Width: 2048, Height: 1024
  • Frame Rate: 30
  • Field Order: Progressive
  • Aspect: Square Pixels (1.0)
  • Profile: Main
  • Bitrate: CBR, 40 Mbps
  • Audio track disabled

For the audio export, I used the following settings:

  • Format: Waveform Audio
  • Audio codec: Uncompressed
  • Sample rate: 48000 Hz
  • Channels: 4
  • Sample size: 16-bit

I then used FFmpeg to combine the two files with the following command:

ffmpeg -i ambisonic_video.mp4 -i ambisonic_audio.wav -channel_layout 4.0 -c:v copy -c:a copy final_video.mov
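
Here -c:v copy and -c:a copy mux the two streams together without re-encoding, while -channel_layout 4.0 labels the audio as a four-channel layout.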

Next, I injected 360 metadata using the 360 Video Metadata app, making sure to tick both ‘My video is spherical (360)’ and ‘My video has spatial audio (ambiX ACN/SN3D format)’.
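
If you prefer the command line, I believe Google’s open-source spatial-media tool (the code behind the metadata app) can perform the same injection; this is a sketch assuming the spatialmedia Python module from the google/spatial-media repository and its --spatial-audio flag:

# Hypothetical equivalent of ticking both boxes in the app
python spatialmedia -i --spatial-audio final_video.mov final_video_injected.mov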

Finally, I uploaded it to YouTube. It took an extra five hours of waiting for the spatial audio track to be processed by YouTube. Both the web player and the native Android and iOS apps appear to support spatial audio.

If your sound recorder was orientated incorrectly, you can correct it afterwards using the plug-ins. In my case, I used the Z-axis rotation to effectively turn the recorder around.
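
For a first-order ambiX stream, this ‘turning around’ is a 180° rotation about the Z axis, which simply negates the X and Y components and leaves W and Z untouched: W' = W, X' = -X, Y' = -Y, Z' = Z.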

There are a lot of fascinating optimizations and explanations of ambisonic and spatial audio processing available to read on Wikipedia:

For comparison, the original in-camera audio (the Ricoh Theta S records in mono) can be heard here:

Sony Noise Cancelling Headphones as Binaural Microphones

My Sony smartphone has an unusual TRRRS (Tip-Ring-Ring-Ring-Sleeve) connector, allowing it to use very reasonably priced noise-cancelling headphones that have an extra microphone in each earphone.

I found that the Sony Sound Recorder app allows recording directly from these two microphones, which are great for binaural recording, and I gave it a go walking along a few busy streets. You can listen on YouTube and Soundcloud:

A Droplet for KRPano for Publishing 360 Videos

Here is the first version of a simple droplet for converting and publishing 360 panoramic videos. It is intended to be used with the processed output file from a Ricoh Theta S, which has the standard 1920x960 resolution. The conversion is easy to do manually, but many people asked for an automatic droplet.

It conveniently includes 32-bit and 64-bit versions of FFmpeg for performing the video conversion.
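
To give an idea of what the droplet automates, the conversion step is roughly this kind of FFmpeg call (a sketch only; the file names and encoding parameters here are my assumptions, not the droplet’s exact settings):

# Hypothetical example: encode one web-friendly H.264 variant of a 1920x960 equirectangular video
ffmpeg -i input.mp4 -c:v libx264 -profile:v main -b:v 8M -vf scale=1920:960 -c:a aac -b:a 128k video_x/video_medium.mp4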

Instructions:

  1. Extract to your KRPano folder.
  2. Drag your MP4 video file onto the ‘MAKE PANO (VIDEO FAST)’ droplet.
  3. Be patient while your video is encoded to various formats.
  4. Rename the finished ‘video_x’ folder to a name of your choice.

You can down­load the droplet here:

Recent improvements include:

  • Adding three quality variations, which the viewer can access in Settings.
  • Improving the quality of the default playback setting.
  • Automatically switching to the lowest quality on mobile devices.
  • Using a single .webm video, as the format is rarely used and very time-consuming to encode.
  • Outputting to a named folder.
Here is a demonstration video and another.

Amazon Wish List 😉

Removing JavaScript Debugging in Production with Laravel Elixir

While using Gulp with Laravel’s Elixir, I found that although it minifies/uglifies JavaScript on a production build, it doesn’t strip JavaScript debugging calls. Implementing this as a custom Task or Extension would also have been far more time-consuming.

Stripping debugging allows you to freely use console.debug() and similar calls in development, which would otherwise reduce the performance of your JavaScript application, and in some cases make it completely unusable in certain browsers.

So I did it myself and made a pull request (GitHub) against the official Laravel Elixir repository, which was approved. It’s nice to give back.
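
With that change merged, no extra configuration is needed; in a standard Elixir setup the behaviour simply follows the build flag:

# Development build: console.debug() calls are left in place
gulp

# Production build: Elixir minifies the JavaScript and now also strips debug calls
gulp --production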

GitHub Pull Request for Laravel Elixir

Open Bionics robotic hand for amputees

“A prototype 3D-printed robotic hand that can be made faster and more cheaply than current alternatives is this year’s UK winner of the James Dyson Award.” (BBC News link)

This is a fantastic idea, with so much value for people who have lost limbs. Bionic prosthetics can cost up to £100,000, with a single hand costing £30,000.

The 3D-printed robotic hand in the article costs £2,000, which is the same price as a prosthetic hook, yet offers similar functionality to the top-of-the-range options.

The designer gets to develop his interest in creating a product, while helping the estimated 11 million hand amputees worldwide.

Open Bionics

Jonathan Hassall