Te Ngaru Paewhenua - Lithic Analysis
In the summer of 2019/2020, I received a Te Ngaru Paewhenua: The Landward Wave Science scholarship which funded me to work on a heritage project at the University of Otago.
My goal was to continue previous research with the Department of Computer Science and the Department of Archaeology which looked at Māori tools made from pakohe/argillite. Ngāti Kuia, a tribe/iwi based in the Nelson-Marlborough region where pakohe was common, traded this ideal tool-making material to other iwi and pākehā around Aotearoa. The main archaeological evidence consists of debitage – stone flakes removed during tool manufacture. My project was to develop automated tools for the analysis of these flakes in situ.
A big thank you to Dr. Steven Mills who supervised me for this project. Much of this project was a continuation of two papers:
- H. Bennani, S. Mills, R. Walter and K. Greig, “Photogrammetric debitage analysis: Measuring Maori toolmaking evidence,” 2017 International Conference on Image and Vision Computing New Zealand (IVCNZ), Christchurch, 2017, pp. 1-6, doi: 10.1109/IVCNZ.2017.8402463.
- F. Petrie, S. Mills, H. Bennani, R. Walter and K. Greig, “Stitching Partial 3D Models with an Application to Modelling Stone Flakes,” 2019 International Conference on Image and Vision Computing New Zealand (IVCNZ), Dunedin, New Zealand, 2019, pp. 1-5, doi: 10.1109/IVCNZ48456.2019.8961032.
Thanks to the researchers who worked on these papers, I was provided with all of the original source code, which made progress straightforward.
In the first few weeks, I focused on becoming comfortable with photogrammetry (creating digital 3D models from a collection of 2D photos), implementing some common statistical analysis tools, and becoming proficient at manipulating the 3D models using Python.
One can create a 3D model of pretty much anything. All you need is a lot of high quality photos.
Once I had gained confidence manipulating 3D models and understood the once-unfamiliar mathematical concepts, I was ready to begin developing the stone tool processing pipeline.
Step 1: capture photos of subject
Step 2: examine the resulting 3D model
Step 3: process the 3D model
The original stone processing software was built in C++ and featured some MATLAB scripts. It wasn't managed by one central application, had no instructions, and involved many manual steps. I scrapped the old software and rebuilt the stone processing pipeline from scratch using the Python programming language.
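To make the "one central application" point concrete, here is a minimal sketch of the shape a single-driver pipeline can take, with one function per stage chained by a single entry point. The function names and stub bodies are illustrative only, not the repository's actual API.

```python
import numpy as np

def load_point_cloud(path=None):
    """Stand-in loader: the real pipeline would read the photogrammetry
    output (e.g. a mesh file); here we fabricate a random cloud."""
    rng = np.random.default_rng(0)
    return rng.normal(size=(1000, 3))

def tidy(points):
    """Placeholder for the tidying stage (plane removal, clustering);
    here it simply recentres the cloud."""
    return points - points.mean(axis=0)

def measure(points):
    """Placeholder for the measurement stage: axis-aligned extents."""
    return points.max(axis=0) - points.min(axis=0)

def run_pipeline(path=None):
    """One driver chaining every stage, so there is a single entry point."""
    return measure(tidy(load_point_cloud(path)))

dims = run_pipeline()
```

The benefit over the original scattered C++/MATLAB steps is that each stage has one signature and one place to live, so the whole process runs end-to-end with a single call.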
https://github.com/rmyj/stonetoolprocessing
In the stone processing pipeline, I implemented statistical techniques such as RANSAC (random sample consensus), PCA (principal component analysis), and k-means clustering to process the data into a tidier form.
Step 4: tidy the 3D model for further processing
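As one example of the tidying stage, RANSAC can separate the flat surface a flake rests on from the flake itself. The sketch below is a simplified, pure-NumPy illustration of the idea on a synthetic point cloud, not the pipeline's actual code; the k-means step (grouping points into separate flakes) is omitted.

```python
import numpy as np

def ransac_plane_mask(points, n_iters=200, threshold=0.01, seed=0):
    """Find the dominant plane in a point cloud with RANSAC and
    return a boolean mask of the points lying on it."""
    rng = np.random.default_rng(seed)
    best_mask = np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        # Three random points define a candidate plane.
        p0, p1, p2 = points[rng.choice(len(points), size=3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:          # degenerate (collinear) sample, skip
            continue
        normal /= norm
        # Points within `threshold` of the candidate plane are inliers.
        mask = np.abs((points - p0) @ normal) < threshold
        if mask.sum() > best_mask.sum():
            best_mask = mask
    return best_mask

# Synthetic scene: a flat turntable surface with a "flake" resting above it.
rng = np.random.default_rng(1)
table = np.column_stack([rng.uniform(-1, 1, size=(500, 2)),
                         rng.normal(0, 0.002, size=500)])
flake = rng.uniform(0.2, 0.5, size=(100, 3))
cloud = np.vstack([table, flake])

table_mask = ransac_plane_mask(cloud)
flake_points = cloud[~table_mask]   # the tidied cloud: flake only
```

Because the table dominates the cloud, the best-supported plane is the table surface, and dropping its inliers leaves only the flake for measurement.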
From the tidy data, I was able to use principal component analysis to make several measurements useful to archaeologists: the maximum dimension (A), percussion length (B), maximum length (C), maximum width (D), and maximum thickness (E). These measurements are used by archaeologists to make inferences about the debitage site.
Step 5: for each of the flakes as a 3D model, measure each dimension
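The core of the PCA measurement idea can be sketched as follows: rotate the flake onto its principal axes, then read off the extents along each axis. This is a simplified proxy only; the project's actual definitions (percussion length, for instance, is defined relative to the striking platform) are more involved, and the toy "flake" below is synthetic.

```python
import numpy as np

def pca_extents(points):
    """Rotate a point cloud onto its principal axes and return the
    extents along each axis, largest first (a rough proxy for
    maximum length, width and thickness)."""
    centred = points - points.mean(axis=0)
    # Eigenvectors of the covariance matrix are the principal axes;
    # eigh returns them in ascending order, so reverse the columns.
    _, eigvecs = np.linalg.eigh(np.cov(centred.T))
    aligned = centred @ eigvecs[:, ::-1]
    return aligned.max(axis=0) - aligned.min(axis=0)

# Toy "flake": an elongated, flattened blob, arbitrarily rotated.
rng = np.random.default_rng(0)
blob = rng.normal(size=(2000, 3)) * np.array([5.0, 2.0, 0.5])
theta = 0.7
rot = np.array([[np.cos(theta), -np.sin(theta), 0],
                [np.sin(theta),  np.cos(theta), 0],
                [0, 0, 1]])
length, width, thickness = pca_extents(blob @ rot.T)
```

Whatever orientation the model was scanned in, PCA recovers a consistent flake-centred frame, so the three extents come out in a stable order.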
Conclusion
After we had a functional pipeline, we could compare the performance of the old and new implementations. To do this, we compared the sum of squared errors across all five measurements for each flake, and the new pipeline showed less error than the old software. We are happy with this result. The improvement can likely be explained by improvements in the underlying photogrammetry software, and by the different techniques we used to measure the 3D flakes.
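For clarity, the comparison metric works like this, shown here with made-up numbers (the measurements below are hypothetical, not the project's actual data):

```python
import numpy as np

# Hypothetical measurements (mm) for one flake: reference values
# vs. each pipeline's estimates of the five dimensions A–E.
truth = np.array([52.1, 48.0, 50.3, 31.2,  9.8])
old   = np.array([54.0, 46.5, 51.9, 29.8, 11.0])
new   = np.array([52.8, 47.6, 50.9, 30.9, 10.1])

def sse(pred, ref):
    """Sum of squared errors over the five measurements of one flake."""
    return float(((pred - ref) ** 2).sum())

old_err = sse(old, truth)   # larger: the old pipeline's estimates are further off
new_err = sse(new, truth)   # smaller: the new pipeline's estimates are closer
```

Summing this per-flake score over the whole assemblage gives a single number per pipeline, so "less error" is a direct numeric comparison.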