Exhibition at the Husqvarna museum

Tacit knowhow is now launching. The first step is an exhibition at the Husqvarna museum, where the Bodyghost software developed within the project is shown in an HTC Vive headset. The museum is open weekdays 10:00–15:00 and weekends 12:00–16:00. As a VR experience, you will be invited to visit the old forge in Granshult, where Gustav Thane is forging; Bräcke smedja, where Bertil Pärmsten is forging; Therese smedja, where Therese Engdahl is forging; and a forge in Väring, where Julius Pettersson is forging. The volumetric video contains a full-scale point cloud of the blacksmiths tapering a piece of heated metal on their anvils. The four clips run about 10 minutes in all. Go there and check it out if you have the chance. If not, you can set up an HTC Vive system wherever you like and wait for us to share the recorded files and the playback software… in February 2020.

Photogrammetry in an old forge

Before the launch of the Husqvarna museum exhibition we made one further improvement: we built a proper 3D environment using Autodesk ReCap Photo and a Structure Sensor Mark II by Occipital. The journey went to Granshults faktorismedja, just north of Jönköping, where we made a final recording with our own blacksmith master Gustav Thane. The small forge, just under 5×5 meters, was scanned as a complement to the volumetric recordings made with the multiple Intel RealSense cameras. The generated point clouds were placed by the anvil in the scanned 3D environment. This really heightened the immersiveness of the experience.

Two things you need to know if you want to try this at home. (1) The Structure Sensor needs an iPhone or iPad to operate, and without an internet connection it will not work. It pretends to work, making all the sounds and showing the scanning visualization, but it doesn't. So even if you are stranded in a 17th-century forge in the middle of nowhere, you need to set up wi-fi or equivalent to make it work. (2) Autodesk ReCap Photo makes really nice photogrammetry of objects, and it can be used with a drone, but it does not give you a room. So we pretended to capture the anvil, shooting from a distance of almost 2 meters and thereby getting a whole lot of the floor and room in the same pictures. This made ReCap automatically reconstruct almost the whole forge in relatively high resolution. (Photogrammetry is a technique where you take a lot of photos, 100 or so, and software constructs a 3D model from them.)

Eventually we did paste in a few details from the Structure Core, but those cannot compare in resolution and quality with the photogrammetry from ReCap and a high-resolution, professional-grade camera. The problem was that the ReCap model turned out to be a file of over 1 GB. It looked amazing, but to avoid lag we scaled it down to 20% of full quality, making it almost as poor as the Structure Core files. Another problem was that the ReCap capture was focused on the anvil; the workbenches and tools lying further away in the forge came out somewhat morphed. So we used 3D modeling software to simply paste workbenches and a piece of the roof into the model, scaling everything to life size and making sure the floor lay in the right place.

It looks a lot better, and it is so much fun to be able to walk around in the forge, on a floor, looking at tools and stuff.

First public appearance

As it happened, an opportunity appeared to exhibit videos recorded with this new technology. The Swedish museum of craft and design, the Röhsska Museum, held an event focused on alternative and practice-based learning and teaching. It was a perfect forum for testing our new technique on a real-life audience. Now everything had to happen in one week: intense coding to finish the software, journeys around the country to record some of the most skilled blacksmiths, and more coding, since a few small things turned out to be less than optimal when running a live production. Still, both hardware and software proved surprisingly stable. The cameras shut down sometimes, but that was easily fixed by pulling out and reinserting their USB 3 cables. Recording mode otherwise ran flawlessly (at the second-highest resolution); the only challenge was that the cameras needed to stand 0.7–1 meter away from the person being recorded to get enough sharp pixels. Playback mode was a little less polished. There were still almost no bugs or unwanted side features, and the ones we discovered were easily dealt with by restarting the software, but the audience at Röhsska reacted to the pixelated result. The cameras record beautiful point clouds straight ahead of them, but also less perfect points in their periphery. When we combine four good sides into one body, those four sides bring eight peripheries with them, resulting in bad-quality points hanging in the air just outside the high-quality point cloud. Otherwise, this new way of recording and documenting activity is everything we could hope for. It is absolutely awesome to be able to take something moving in our physical environment and put it into a digital one without animation. We have turned the perspective of what "virtual" reality is back toward reality.
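One way to tame those peripheries is to crop each camera's point cloud to a central cone before merging. Here is a minimal C++ sketch of that idea using librealsense2; the angle threshold is an assumed tuning value, not our actual setting.

```cpp
// Minimal sketch (not our production code): filtering out peripheral
// points from one RealSense camera before merging multiple clouds.
#include <librealsense2/rs.hpp>
#include <cmath>
#include <vector>

struct Point3 { float x, y, z; };

// Keep only points within max_angle_deg of the camera's optical axis,
// dropping the noisy periphery that otherwise "hangs in the air".
std::vector<Point3> crop_periphery(const rs2::points& pts, float max_angle_deg)
{
    const float max_tan = std::tan(max_angle_deg * 3.14159265f / 180.0f);
    std::vector<Point3> kept;
    const rs2::vertex* v = pts.get_vertices();
    for (size_t i = 0; i < pts.size(); ++i) {
        if (v[i].z <= 0.0f) continue;          // invalid depth, skip
        float r = std::hypot(v[i].x, v[i].y);  // lateral offset from axis
        if (r / v[i].z <= max_tan)             // inside the central cone
            kept.push_back({ v[i].x, v[i].y, v[i].z });
    }
    return kept;
}
```

Throwing away the outer points costs coverage, but with four overlapping cameras the good central cones can cover what each periphery loses.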

The awesome blacksmiths recorded are Bertil Pärmsten at Bräcke smedja, Therese Engdahl at Therese smedja and Julius Pettersson at Manufaktursmide.

Multiple Intel RealSense cameras at once

A deep dive into the RealSense cameras from Intel has proven fruitful. Gunnar and Mikael have written new software in C++, in parallel with extensive construction work in Unity. Today the whole team engaged in a first recording of a proper VR video clip, captured with four Intel RealSense cameras plus sound. Gustav Thane took up the hammer again and did his thing. It became clear that a more user-friendly interface for handling the files is needed.
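To give a flavor of what recording with four cameras involves, here is a minimal C++ sketch using librealsense2 that opens every connected camera with its own pipeline, keyed by serial number. It is a sketch under assumptions, not our actual recorder; the stream settings are the ones we describe below.

```cpp
// Minimal sketch: one pipeline per connected RealSense camera,
// keyed by serial number. Not the project's actual recorder.
#include <librealsense2/rs.hpp>
#include <map>
#include <string>

int main()
{
    rs2::context ctx;
    std::map<std::string, rs2::pipeline> pipes;

    for (auto&& dev : ctx.query_devices()) {
        std::string serial = dev.get_info(RS2_CAMERA_INFO_SERIAL_NUMBER);
        rs2::config cfg;
        cfg.enable_device(serial);  // bind this config to one specific camera
        cfg.enable_stream(RS2_STREAM_DEPTH, 848, 480, RS2_FORMAT_Z16, 30);
        cfg.enable_stream(RS2_STREAM_COLOR, 1280, 720, RS2_FORMAT_RGB8, 30);
        rs2::pipeline pipe(ctx);
        pipe.start(cfg);
        pipes.emplace(serial, pipe);
    }

    for (int i = 0; i < 300; ++i) {           // roughly 10 s at 30 fps
        for (auto& [serial, pipe] : pipes) {
            rs2::frameset fs;
            if (pipe.poll_for_frames(&fs)) {
                // hand fs.get_depth_frame() / fs.get_color_frame()
                // to the recorder or point-cloud builder here
            }
        }
    }
}
```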

Prior to this session, Gunnar spent some time building a rather intuitive method for calibrating the four cameras to each other using the hand controllers in the VR environment. Mikael built a way to paste a 360° movie onto the walls and ceiling of the space where the textured point cloud moves around. But most of the time, Mikael's skills in programming Unity and Gunnar's mastery of problem-solving have been just the combo to push the project forward. A brand-new computer had to be purchased too. The old one was kick-ass a year ago but had difficulties handling the heavy files and the processor-draining recording and playback. Playing back four simultaneous point clouds is rather heavy on the processor, so we experimented with a decimation filter on the point cloud rendering, resulting in a better frame rate… but in the end a new GPU was our only choice (the new computer has an RTX 2080, an Intel Core i7-9700K, and a proper SSD). Today the depth sensors record at almost full capacity, 848×480 pixels (full is 1280×720), whilst the texture is recorded at 1280×720. The reason we do not use the depth sensor's full capacity is that it would only give us 15 frames per second, and 30 frames per second seemed more attractive at the time… it still does. Soon we are ready for a proper rehearsal with an actual audience.
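The decimation experiment is easy to illustrate: in librealsense2 it is a one-filter affair. A hedged sketch follows; our real rendering pipeline lived in Unity, so the details there differed, and the magnitude value below is an assumption, not our setting.

```cpp
// Minimal sketch: applying librealsense2's decimation filter to a depth
// frame before building the point cloud, trading resolution for frame rate.
#include <librealsense2/rs.hpp>

rs2::points depth_to_points(rs2::depth_frame depth, rs2::video_frame color)
{
    static rs2::decimation_filter decimate;
    // Magnitude 2 halves width and height, i.e. roughly a quarter of the
    // points left to render per frame.
    decimate.set_option(RS2_OPTION_FILTER_MAGNITUDE, 2.0f);
    rs2::frame filtered = decimate.process(depth);

    static rs2::pointcloud pc;
    pc.map_to(color);              // attach color texture coordinates
    return pc.calculate(filtered); // textured point cloud from decimated depth
}
```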

Hot as hell

Gustav in VR-gear

Some warm winds blew through today's investigation when we finally got to use some hot steel! Wow! Yes, you assumed correctly: we will try to measure the temperature of the steel during forging. In honor of the day, our blacksmith brought us a forge, since we were going to meet a measurement guru from Termisk Systemteknik. Mr. Guru (who usually goes by the name of Claes Nelsson) brought along some nice heat cameras to show us their almost magical properties. These heat cameras sense IR radiation to measure temperature. However, they need a few reference points to calibrate against the actual temperature of the steel in order to give a correct measurement and visualization, so good old SP lent out their amazing self-built thermocouple data logger together with a few thermocouples. Thank you!

Let's start! The first critical point in measuring the temperature was how to attach the thermocouple to the steel. How would we do this? Our blacksmith came to the rescue and prepared his steel by cutting a score in it, which we could then use to fasten the thermocouple. We needed to do this to measure the emissivity of the steel.

Emissivity of metals is a difficult area, one that is still being actively researched. All materials emit electromagnetic radiation when heated, and this radiation varies with the temperature of the material. Furthermore, there is something called a black body: a physical body that is perfectly black, and thus absorbs all incoming radiation without reflecting anything. When such a body is heated it will, just like any other physical body, emit radiation in the form of electromagnetic waves, which is then called "black body radiation". In reality, however, there is no such thing as a perfectly black body. This is where emissivity comes in handy: emissivity is simply a material property that describes how close a certain material comes to having the properties of a black body, expressed as a fraction between 0 and 1. And thanks to the physicist Max Planck there is an equation, Planck's radiation law, that describes the frequency distribution of the electromagnetic waves emitted from a black body at a given temperature in thermal equilibrium. So why is this important to us? Because if we know the emissivity of a material AND the distribution of electromagnetic waves coming from that material, we can find its temperature using an IR camera. This time the emissivity was obtained by actually measuring the temperature with the thermocouples and feeding that into the software of the IR camera, which then found the emissivity that made the readings on its screen match our measurements.
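For reference, the physics boils down to two lines (standard textbook formulas, nothing project-specific):

```latex
% Planck's radiation law: spectral radiance of a black body at temperature T
B_\lambda(\lambda, T) = \frac{2 h c^2}{\lambda^5}\,
                        \frac{1}{e^{\,h c / (\lambda k_B T)} - 1}

% A real surface emits only a fraction \varepsilon of this:
L_\lambda(\lambda, T) = \varepsilon\, B_\lambda(\lambda, T),
\qquad 0 \le \varepsilon \le 1
```

The IR camera measures the left-hand side, so with a known ε it can invert Planck's law to get T. In our case we did the reverse: the thermocouples supplied T, and the camera's software solved for the ε that made its readings agree.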

So why did we go through all this trouble to measure the temperature of the steel? It matters to us to measure the temperature and find a way to visualize it, since temperature is an important parameter in the practice of forging. Unfortunately, we were not convinced that this method was the right way forward, as the results were too uncertain. Among the problems we ran into were that the cameras did not give a stable reading, and that they were also a bit too expensive.

A lot of other things also happened today in addition to testing the heat cameras. An attempt was made to program an algorithm that measures brightness using a regular camera (and thus, somehow, extracts the temperature from that reading), and we continued the work with the depth camera. We took a few steps forward, and the journey progresses.
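The brightness idea can be sketched in a few lines of C++ with OpenCV. Everything numeric here is a hypothetical placeholder; a real brightness-to-temperature mapping would have to come from a calibration against, for example, the thermocouples.

```cpp
// Minimal sketch of the brightness idea: find the glowing region in a
// regular camera image and reduce it to one brightness number that a
// calibration curve could map to temperature.
#include <opencv2/opencv.hpp>

double glow_brightness(const cv::Mat& bgr)
{
    cv::Mat gray, mask;
    cv::cvtColor(bgr, gray, cv::COLOR_BGR2GRAY);
    cv::threshold(gray, mask, 200, 255, cv::THRESH_BINARY); // "glowing" pixels (assumed threshold)
    if (cv::countNonZero(mask) == 0) return 0.0;            // nothing glowing in frame
    return cv::mean(gray, mask)[0];                         // mean brightness of the glow
}

// Hypothetical placeholder mapping; real coefficients would come from
// calibrating against a trusted temperature reference.
double brightness_to_celsius(double b) { return 600.0 + 2.5 * b; }
```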

And from all of us to all of you: a very merry Christmas!

Argus-eyed

What has happened here lately?

We have continued our investigation of how to use depth cameras. This time we took the challenge to another level: our geniuses made an attempt at connecting all five depth cameras at once. Five… What on earth were we thinking? Good question, but those who never test weird stuff never get to be part of the fairy tale! Of course, our evolving talents want to overcome the latest obstacles and challenges in the project's technological development and walk the narrow path of this enchanted story, but this time it unfortunately did not go all the way. The five cameras did not log the picture, and one of the cameras delivered only a still image. We suspect this is because too much data is being sent, exceeding the capacity of USB 2.1.
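A quick way to check for this is to ask each camera how it actually enumerated, since a RealSense that comes up on a USB 2.1 link has far less bandwidth than one on USB 3.x. A minimal diagnostic sketch, assuming librealsense2's device-info API:

```cpp
// Minimal diagnostic sketch: print each connected RealSense camera's USB
// connection type, to spot cameras starved of bandwidth.
#include <librealsense2/rs.hpp>
#include <iostream>

int main()
{
    rs2::context ctx;
    for (auto&& dev : ctx.query_devices()) {
        if (dev.supports(RS2_CAMERA_INFO_USB_TYPE_DESCRIPTOR))
            std::cout << dev.get_info(RS2_CAMERA_INFO_NAME) << " ("
                      << dev.get_info(RS2_CAMERA_INFO_SERIAL_NUMBER) << "): USB "
                      << dev.get_info(RS2_CAMERA_INFO_USB_TYPE_DESCRIPTOR) << "\n";
    }
}
```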

Worth noting is also that we fiddled and fixed for about half a day, and since then we have spent another half-day meeting our sister studio, RISE Interactive Umeå, to discuss the project. RISE Interactive Umeå has done, and is doing, a lot of projects of interest to us in Tacit knowhow, and we hope we can join forces within this project in the future.

In our video blog you can see that we are working on a 3D-printed jig for the camera and the tracker. The jig is intended to facilitate tracking of the camera's position. No final version has been produced yet, so we continue to iterate on the prototype. This wasn't a story of happily ever after, and our story simply continues…

To look into the depth

The photo and experiment studio

After the last episode of technical trolling, our amazing team was set against a gigantic obstacle: the HTC Vive put up significant resistance to the minds of Niels, Gunnar, and Mike. But through guidance and discussions with our blacksmith, they were able to find a new path that could lead them towards the goal.

This time we were set on using depth cameras, which let us measure distance in an image using depth technology and active infrared (IR). Our depth camera uses depth sensors, an RGB sensor, and an infrared projector. Our idea is that several perspectives and angles would allow the cameras to follow the movement and create a more accurate and detailed picture than we were able to achieve in our previous experiments. It should be noted that the cameras' updated application programming interface (API) made it easier for us to pair two cameras, so this was a great occasion to use two of them at once. In this experiment the cameras were used both in the physical room and in the 3D room, which meant that our tech savants had to match positions between the 3D room and the physical room. They successfully connected the cameras, the HTC Vive, and the trackers to a computer and to Unity. The HTC Vive and the trackers were a good aid for reproducing position data on the move, which made it easier to relocate any of the cameras. The tracking system gave us a rough, basic calibration, but a fine calibration was still required to produce a true mirrored image in the 3D space. Part of the camera calibration was defining a maximum area that the cameras could use. The key word in this test was calibration, calibration, CALIBRATION!

Gunnar and Mike worked on calibrating the cameras manually, with respect to both position and rotation. However, it is very difficult to get the calibration really good when doing it manually; one needs to know the exact positions in physical space as well as each camera's rotation. Something else that became problematic was the recording: the recordings on the two cameras did not start simultaneously, which caused a shift between the recorded images. Another problem is that the recorded files are very large, which makes any form of editing demanding. Hence, neither the recording nor the playback was optimal at this time.
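What the calibration boils down to, mathematically, is one rigid transform per camera that moves its points into a shared coordinate frame. A minimal C++ sketch of that step follows; the actual rotation and translation values would come from the tracker plus manual fine-tuning, so nothing here is our real code.

```cpp
// Minimal sketch: a rigid transform (rotation + translation) that maps one
// camera's points into the shared coordinate frame before merging clouds.
#include <array>

struct Vec3 { float x, y, z; };
using Mat3 = std::array<std::array<float, 3>, 3>;

struct RigidTransform {
    Mat3 R;   // rotation of the camera in the shared frame
    Vec3 t;   // position of the camera in the shared frame

    Vec3 apply(const Vec3& p) const {
        return {
            R[0][0]*p.x + R[0][1]*p.y + R[0][2]*p.z + t.x,
            R[1][0]*p.x + R[1][1]*p.y + R[1][2]*p.z + t.y,
            R[2][0]*p.x + R[2][1]*p.y + R[2][2]*p.z + t.z,
        };
    }
};

// Every point from camera B is mapped into the shared frame before being
// merged with camera A's points; a bad R or t shows up as a "ghost" offset.
```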

Something that was spectacular, however, was the visual experience of meeting one's own external form. Like Narcissus caught in the 3D mirror image, the viewer was filled with curiosity about the pictured self. Yes, I promise, it was a splendid view, and it was hard to be separated from this beautiful creation.

 

Motion capture results and a new direction…

The anvil prepared for the day’s work!

As a cold and crisp morning dawned in Gothenburg, the crew gathered at the RISE Interactive studio, ready and eager for a productive day of testing. Today (or well, when writing this it was actually three weeks ago…) was a big day, since we were going to test our tracking system together with a real blacksmith's oven to see whether it would disturb the measurements. The tracking itself had also been improved by our augmented-reality sages Gunnar and Mike, and should now possess such features as automatic calibration, making the initial setup a breeze. Or so we thought…

At first we pondered setting up the equipment inside the studio in our VR area, where we had performed the testing the previous time. But since we figured the rest of the studio might not appreciate listening to the "DING" of a hammer hitting a vice for several hours, we decided to move all of our equipment outside. This included two computers, the whole HTC Vive tracker system, the anvil and tools for blacksmithing, and the heater. It might be worth pointing out that the heater is not what most people picture when they think of a heater in a forge (a huge oven with an open fire), but a modern counterpart based on induction. So no HE-MEN were required to move the heater; a strong blacksmith master named Gustav proved to be enough.

With all of our equipment outside, Gunnar, Mike and Niels went about the business of calibrating the trackers so that we could get ready for testing. This, however, turned out not to be quite the walk in the park we had anticipated. Initially, nothing really went our way. When doing the calibration in a static position everything seemed fine, but when we started to move the camera around, its tracking started to drift, resulting in the markers on the anvil moving as we moved around with the camera. We kept trying until we got a calibration result that was just good enough to actually start testing. And just when everything seemed fine and dandy and Gustav was about to start the induction heater, well, the heater didn't work anymore.

Time to get creative! Gustav brought forth a portable propane heater, built a small oven from a couple of bricks, and was able to heat the glorious piece of metal to at least somewhere near the optimal temperature. So finally we could test our tracking system and generate traces! Too bad they really didn't behave as we wanted them to. They pretty much went haywire, and as soon as Gustav struck his hammer on the anvil they went completely wild. Gustav then tried mounting the tracker unit on a saw instead, but we experienced the same problems.

So what went wrong? Regarding the calibration problems, we believe the issue might have been the stability of the tripods we used for the HTC Lighthouse base stations (the laser-emitting boxes that the HTC system uses to track the controllers with swept laser pulses and trigonometry): when they swayed even just a little, the calibration was ruined.

Regarding the problem with the traces, our analysis was that the built-in IMUs (inertial measurement units) registered a very high acceleration value when the hammer hit the anvil. Accelerometer readings always contain a small error due to the accuracy of the sensors, and when the acceleration gets very high, this error becomes proportionally larger. Since the acceleration is then integrated twice to obtain position, the measurement error grows even bigger, leaving the position of the tracker marker completely off. This lasts until the lasers from the HTC Lighthouse towers manage to pull the tracker point back to its correct place.
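A back-of-the-envelope version of that argument, using standard kinematics (the numbers below are illustrative, not measured):

```latex
% A constant acceleration error \delta a, integrated twice over time t:
\delta v(t) = \delta a \, t, \qquad
\delta p(t) = \tfrac{1}{2}\, \delta a \, t^2
```

Even a modest error of 1 m/s² sustained for half a second puts the marker about 0.5 · 1 · 0.5² ≈ 0.13 m off target, and a hammer blow produces far larger acceleration spikes for the accelerometer to mis-measure.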

So where do we go from here? It might be possible to turn the accelerometers off, or to change the weighting function so that the sensor fusion listens less to their signal. We have also pondered the possibility of using Intel's RealSense depth cameras to create a three-dimensional depiction of the work, but without traces this time.

But that is something for next time!

First tests with motion capture

Motion capture experiments with Gustav and Gunnar in the RISE Interactive studio

This can be seen on video (in HD, of course) on our brand-new YouTube channel "Tacit knowhow".

Today the group, consisting of Gustav, Gabriella, Gunnar, Mikael, Peter and Fredrik, got together and performed a first test of a motion-tracking technique using the HTC Vive together with Vive Trackers (thank you, Valve!) to try and track the hammer movements performed by our master blacksmith Gustav. After a couple of days of hard work getting the hardware and software (we are using Unity) to play together, Gustav got to show his skills with the hammer. Or at least wave his hammer around with the rest of us trying to track it. The test, even though it was rather crude, was a success! We were able to add an animation showing the movement of the hammer as a highlighted track in three-dimensional space.
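The highlighted track is conceptually simple: sample the tracker's position every frame into a buffer and render the buffer as a polyline. A minimal C++ sketch of that bookkeeping follows; our version lived in Unity, so this is an illustration, not the project code.

```cpp
// Minimal sketch of the "highlighted track" idea: sample the tracker's
// position each frame into a bounded buffer and render it as a polyline.
#include <cstddef>
#include <deque>

struct Vec3 { float x, y, z; };

class HammerTrail {
public:
    explicit HammerTrail(std::size_t max_points) : max_points_(max_points) {}

    void add_sample(const Vec3& tracker_pos) {
        points_.push_back(tracker_pos);
        if (points_.size() > max_points_)   // drop the oldest point so the
            points_.pop_front();            // trail fades behind the hammer
    }

    const std::deque<Vec3>& polyline() const { return points_; }

private:
    std::size_t max_points_;
    std::deque<Vec3> points_;
};
```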

The purpose of today's testing was to see how we can use this technology to record the motion of tools in order to better record, display and teach tacit knowhow, somehow. It is just a first test, so we do not yet know whether this particular technology will be used, but it sure looks promising!

Start-up meeting!

Today (17/10-18) we are glad to announce that we had the first meeting of this very exciting project that we call Tacit Knowhow! The meeting brought together Gustav, Peter, Gunnar and Fredrik, who took the time to discuss the outlines of the project.

We are all looking forward to this project, and how to capture and present tacit knowhow!