Te Ngaru Paewhenua - Hei Matau
In the summer of 2020/2021, I received a Te Ngaru Paewhenua: The Landward Wave Science scholarship, which funded me to work on two heritage projects at the University of Otago.
The first was a digital preservation project, 3D modelling items of cultural significance. The second was a 3D art project, using items from early settlers in the Otago Museum. Both of these projects heavily featured photogrammetry.
Photogrammetry is the process of creating a digital 3D model of an object from many overlapping photographs. This means a physical object can be preserved for later examination, processing, and viewing. Because we were working with items of great cultural significance, much of our time was spent refining the photogrammetry process to ensure the resulting 3D model was as close as possible to the original object.
3D Modelling Items of Cultural Significance
Many items with great cultural significance are housed in museums and other collections but are not readily accessible due to limitations of display space. Even those objects on display are static and cannot be handled or examined in detail. 3D modelling, virtual reality, and 3D printing provide new ways of making these items accessible. The objective of the project was to develop tools and pipelines to make this process easier. We worked with Gerard O’Regan (Curator, Māori, Otago Museum), Lana Arun (Assistant Curator, Māori, Otago Museum), Jennifer Copedo (Assistant Collection Manager for Humanities, Otago Museum), and Vicki Lenihan (multimedia artist).
Methods
Otago Museum staff took thousands of photos of select taonga, which we then processed using state-of-the-art software to create the highest-quality 3D digital models we could.
Paemanu: Tauraka Toi, A Landing Place
The second half of the project involved working with multimedia artist Vicki Lenihan, 3DFY.me, and the Otago Museum. She aimed to create accurate 3D models of Māori-crafted iron fish hooks, but in a modern material, New Zealand steel, for the Paemanu: Tauraka Toi, A Landing Place exhibition at the Dunedin Public Art Gallery. We performed the photography of the fish hooks, created the 3D models, and organised the 3D printing.
Methods
The second subproject involved photography, photogrammetry using Agisoft Metashape, and manual editing of the 3D model using MeshLab.
We photographed three fish hooks made by Māori around the time the early settlers arrived. Otago Museum provided an ample space for the photo shoot. To create the highest quality 3D model, the highest quality photos must be taken of the object, so the camera settings and lighting had to be perfect. We experimented with several different settings and found that a combination of the lowest ISO possible, the narrowest aperture possible (highest f-number), and a 1/10 shutter speed worked well for our setup: higher ISO introduces noise into the image, while a narrow aperture keeps the whole object in focus. As with any information system, garbage in, garbage out. Each image was taken using a remote shutter release and flash.
Each fish hook was placed in 4 different positions so that every angle of the object could be captured. At each position we used an automatic turntable to take 28 photos, repeating this at 5 different tripod heights to get 5 different angles. We took 775 photos of hook 1, 707 of hook 2, and 587 of hook 3. Approximately 80% of these photos could be used in the final processing; the rest were discarded because the angle did not present enough information or because blemishes or shine were present.
Once the photography was done, we moved on to the software side of photogrammetry. Agisoft Metashape was our software of choice: Agisoft is an industry leader in photogrammetry, and Metashape is used by archaeologists and historians around the world. There is alternative, free software available; however, we highly recommend Metashape for any project requiring precision. Metashape makes photogrammetry easy. We:
- Converted the RAW images to PNG using a custom Python script
- Imported the photos into Metashape
- Created masks for each photo, removing background noise to help the software
- “Aligned” the photos (letting the software estimate where each photo was taken relative to the others using common features)
- Generated a dense cloud of vertices
- Overlaid a mesh onto those vertices to create the 3D model
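The custom conversion script itself wasn’t published, but a minimal sketch of a RAW-to-PNG batch converter might look like the following. It assumes the `rawpy` and `imageio` libraries and Nikon `.NEF` files; the function and folder names are ours, not from the original script.

```python
from pathlib import Path

def png_path(raw_file: Path, out_dir: Path) -> Path:
    """Map a RAW file (e.g. IMG_0042.NEF) to its PNG output path."""
    return out_dir / (raw_file.stem + ".png")

def convert_folder(raw_dir: Path, out_dir: Path) -> None:
    """Demosaic every RAW file in raw_dir and write PNGs to out_dir."""
    # rawpy (a LibRaw wrapper) and imageio are imported lazily so the
    # pure path-mapping helper above works without them installed.
    import rawpy
    import imageio.v3 as iio
    out_dir.mkdir(parents=True, exist_ok=True)
    for raw_file in sorted(raw_dir.glob("*.NEF")):  # Nikon RAW; adjust per camera
        with rawpy.imread(str(raw_file)) as raw:
            rgb = raw.postprocess()  # demosaic to an 8-bit RGB array
        iio.imwrite(png_path(raw_file, out_dir), rgb)

# Usage: convert_folder(Path("raw_photos"), Path("png_photos"))
```

Converting to PNG rather than JPEG keeps the conversion lossless, which matters when the alignment step depends on fine image features.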
With the photos for hook 1 alone taking up 20.4GB of space, it wasn’t uncommon for processing to take 12 hours per hook, on top of a day or two spent creating high-quality masks for each image. This meant most of the computer processing was done overnight, and every morning involved crossed fingers and hoping it had worked. Metashape cannot infer how large an object is without a reference scale; this is why the final 3D model also included a small ruler. With that reference, the team at 3DFY.me could print the model at the same size as the real object. This was an additional, manual process that had to be done.
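Scaling against a reference ruler is simple arithmetic: measure the ruler’s length in model units, divide its real length by that, and multiply every vertex by the result. A sketch of the idea (the function names are ours, not Metashape’s API):

```python
import numpy as np

def scale_factor(ruler_model_units: float, ruler_real_mm: float) -> float:
    """Multiplier that converts model units to millimetres."""
    return ruler_real_mm / ruler_model_units

def scale_vertices(vertices, factor: float) -> np.ndarray:
    """Uniformly scale an (N, 3) vertex array about the origin."""
    return np.asarray(vertices, dtype=float) * factor

# A 50 mm ruler that measures 0.5 units in the model gives a factor of 100:
factor = scale_factor(0.5, 50.0)
hook = scale_vertices([[0.0, 0.0, 0.0], [0.1, 0.2, 0.0]], factor)
```

Because the ruler sits in the same scene as the hook, one factor scales the whole model correctly.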
After all three 3D models had been created and scaled to their correct size, they were sent to 3DFY.me and 3D printed in SLS nylon.
Results
The project was a success. The 3D-printed models appeared faithful to the originals and looked almost identical thanks to Vicki’s handiwork. Due to time pressure, Vicki was unable to have the 3D models printed in New Zealand steel, so she opted for a plastic instead. She then hand-painted them to match the material and texture of the originals. The resulting models are on exhibition until the 25th of April 2022 as part of the Paemanu: Tauraka Toi, A Landing Place exhibition.
VICKI LENIHAN (Waitaha, Kāti Māmoe, Ngāi Tahu, Kāti Huirapa, Ngāi Tūāhuriri), He Whao | Skills, 2021 (detail). He kai kei aku rikarika – there is food at my hands. SLS nylon, steel acrylic paint.
The most arduous and time-consuming part of the project was masking. We found that automated “magic wand” tools in image-processing software did a pretty good job of masking an image, but the result was never perfect on the first try and always required a manual touch-up before the image could be used for photogrammetry. We tried deliberately overexposing the images to create a sharp edge around the object, which did help, but each image still required some manual labour.
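With a deliberately overexposed background, a first-pass mask can be as simple as a brightness threshold: any pixel whose darkest channel is near white is treated as background. A sketch of that idea (the threshold value is illustrative); it still leaves the manual touch-up, which is exactly the pain point:

```python
import numpy as np

def background_mask(rgb: np.ndarray, threshold: int = 240) -> np.ndarray:
    """True where all channels blow out towards white, i.e. the
    overexposed background; False over the (darker) object."""
    return np.asarray(rgb).min(axis=-1) >= threshold

# Tiny 1x2 image: one white background pixel, one dark object pixel.
img = np.array([[[255, 255, 255], [30, 25, 20]]], dtype=np.uint8)
mask = background_mask(img)  # → [[True, False]]
```

Taking the minimum over the channels avoids masking out bright but coloured regions of the object, such as shine on the steel.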
We think that a smart machine learning driven “magic wand” tool could customise the behaviour of the crop to the user, saving potentially days of manual processing time.