Welcome to this TPW Workflow Series post on keeping a record of the process of 3D scanning and (post-)processing. This post is useful to read both before and after following the scanning and processing workflows. What the operator does, the choices he or she makes in particular software and hardware settings, or the procedure for selecting which archaeological material to scan, all affect the digital outcome. A reflexive approach to the practice of scanning and the role of the operator can therefore increase the reproducibility of the production of archaeological data and the resulting knowledge. If the process is reproducible, datasets produced by different projects can be compared more accurately.
Documenting the process ensures transparency not only of the data retrieval itself but also of the circumstances that might affect it. Every project has its specific research aims, and each specialist might have clear ideas about what to select and how to document it, while others might have contrasting ideas and expectations. More general circumstances also leave their traces in the resulting 3D model. For example, was the scanner standing on a solid concrete floor, on a quaky Greek island, or in the shaky attic of an old building? Are there train lines nearby, causing vibrations? How is the light? Was the archaeological object black, or did it have surfaces with different colour hues? We call these traces ghost artefacts, as they do not exist in the original object but were created in the process of recording (for more information, see the posts dedicated to this and the Sketchfab page). All these different kinds of issues need to be recorded to ensure a certain shared standard in data quality.
Metadata and paradata
Many definitions of paradata exist, yet they share a basic idea: paradata is information about human processes of understanding and interpretation of data objects (Huvila 2012, 98). In TPW, paradata is information about the data collection procedure, working conditions and other circumstances, mostly described in the detailed TPW workflows. Metadata is data about data, including technical matters such as scanning parameters, details about the raw scans, and the post-processing settings employed to generate 3D models. Keeping a record of this process enables us “to test the validity of our conclusions with more precision and confidence” (Frischer 2008, v) and to track certain choices or correct errors made in the process. Changing elements in this chain of data collection, creation and processing might affect ensuing interpretations.
Accessibility: Multiple extensions
The first step in making 3D models accessible is to export them to as many file formats as possible. There is no standard file format yet, although several formats, such as OBJ and PLY, are accepted by most projects and software packages. In most cases, your native file cannot be opened in another program or 3D modelling software.
TPW exports most of its native files (.hpscanproject) to OBJ and PLY, and occasionally to PDF, STL and DAE as well. Many of the files are also decimated, i.e. reduced in file size, as almost all native files are larger than 1GB. Decimation does, however, lose data. To compensate, all files – native, exported and decimated – are made available as downloads through the online TPW database. Users with varying computing facilities can then choose compatible files they can work with.
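A simple check that the standard exports actually exist for each object can help keep the database complete. The sketch below assumes exports are stored as `<objectID>.<ext>` in a single folder; that naming convention is an assumption for illustration, not documented TPW practice.

```python
from pathlib import Path

# Formats discussed in this post: OBJ and PLY are always exported,
# PDF/STL/DAE only occasionally, so only the first two are required here.
EXPECTED = [".obj", ".ply"]

def missing_exports(folder: str, object_id: str) -> list[str]:
    """Return the expected export extensions that are absent for an object."""
    base = Path(folder)
    return [ext for ext in EXPECTED if not (base / f"{object_id}{ext}").exists()]
```

Running this over a list of object IDs before upload flags any object whose OBJ or PLY export was forgotten.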
Save the data!
The first thing to do is to save two versions of the native scan file: the raw data and the work file. At TPW we usually add the operator’s initials or CUT to the objectID.
In the work file you can clean and cut the scans as much as needed (= data manipulation, often termed ‘enhancement’); how much depends on the standards and experience of the operator. As this is a subjective process, the original raw data is kept untouched in the other file.
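The two-file convention can be made mechanical with a small helper. The object ID and initials below are hypothetical examples; the pattern simply mirrors the rule stated above (raw file keeps the plain objectID, work file gets the operator’s initials or CUT appended).

```python
def scan_filenames(object_id: str, operator_initials: str) -> tuple[str, str]:
    """Return (raw_file, work_file) names for a native scan.

    The raw file keeps the plain objectID; the work file is marked with
    the operator's initials (or 'CUT'), so manipulated data is never
    confused with the original raw data.
    """
    raw = f"{object_id}.hpscanproject"
    work = f"{object_id}_{operator_initials}.hpscanproject"
    return raw, work
```

For example, `scan_filenames("AMP001", "JH")` yields `AMP001.hpscanproject` and `AMP001_JH.hpscanproject` (both names hypothetical).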
The alignment and fusion of the separate scans into a mesh (the 3D model) is then exported to OBJ and PLY. The OBJ is then decimated to approximately 180–200MB using Meshlab, with the settings indicated in the post-processing workflow. This first decimation preserves most of the morphological detail and is small enough for online display on Sketchfab. Further decimation might be required for other online platforms or to create 3D-printable files; for the latter, TPW provides STL files in several cases.
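When aiming for a size budget such as 180–200MB, a rough first guess for the decimation target can be derived from the current file. This assumes ASCII OBJ file size grows roughly linearly with face count, which is a crude heuristic, not part of the TPW workflow; the Meshlab settings in the post-processing workflow remain authoritative.

```python
def target_face_count(current_faces: int, current_bytes: int, target_bytes: int) -> int:
    """Estimate the face count needed to reach a target OBJ file size.

    ASCII OBJ size scales roughly linearly with the number of faces
    (and the vertices they share), so a proportional scale-down gives
    a usable starting value for Meshlab's decimation filter.
    """
    if current_bytes <= target_bytes:
        return current_faces  # already within the size budget
    return int(current_faces * target_bytes / current_bytes)
```

For instance, a 1GB mesh with 2 million faces would be scaled to roughly 400,000 faces for a 200MB target.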
Metadata documentation and the TPW metadata sheet
The technical specifications of these manipulations should be recorded. TPW has developed a spreadsheet for this, which can be downloaded: TPW_MetadataSheet.
The following data is recorded in this Excel sheet:
Sharpness and Close Holes are functions that enhance parts of the scan that are left blank or unclear. These functions alter existing data or even create new data (when closing holes) and therefore need to be recorded.
The number of vertices can be found in Meshlab; see the image below:
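If you prefer to verify the count outside Meshlab, the vertices and faces of an exported ASCII OBJ can be tallied directly, since each vertex is a `v` record and each face an `f` record. This is a minimal sketch for plain ASCII OBJ files only; binary formats such as PLY would need a different reader.

```python
def obj_counts(path: str) -> tuple[int, int]:
    """Count vertex ('v ') and face ('f ') records in an ASCII OBJ file."""
    vertices = faces = 0
    with open(path) as fh:
        for line in fh:
            if line.startswith("v "):
                vertices += 1
            elif line.startswith("f "):
                faces += 1
    return vertices, faces
```

The result should match the figures Meshlab reports for the same file, which makes it a quick cross-check when filling in the metadata sheet.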
To determine the average resolution before fusion, make all separate scans visible in the List of Scans in HP Scan Pro (and not the FusionResult!). In the Fusion tab to the left of the viewport you can then find the average resolution of all those scans.
The FusionResolution is displayed in the same tab, but now you need to make only the FusionResult visible in the List of Scans.
Once you start decimating the 3D model in Meshlab, and perhaps doing some post-cleaning, the geometry of the 3D model is further manipulated or changed, and these changes must be recorded in the DecimationNotes column of the metadata sheet. You can find the data on these alterations in the Layer Dialog to the right of the main viewer. You only need to copy the actual changes, in this case the null faces and duplicated vertices that the decimation filter in Meshlab has removed.
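The recording steps above can also be scripted, for instance when many objects are processed in a batch. The column names below are inferred from the fields discussed in this post; the actual TPW_MetadataSheet may use different names or additional columns, so treat this as a sketch rather than the official sheet format.

```python
import csv
import os

# Columns inferred from the fields this post asks you to record;
# the real TPW_MetadataSheet may differ.
COLUMNS = ["ObjectID", "Sharpness", "CloseHoles", "Vertices",
           "AverageResolution", "FusionResolution", "DecimationNotes"]

def append_metadata(path: str, row: dict) -> None:
    """Append one object's metadata to a CSV sheet, writing a header first if the file is new."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=COLUMNS, restval="")
        if new_file:
            writer.writeheader()
        writer.writerow(row)
```

Fields you have not measured yet can simply be omitted from the row; they are left blank in the sheet and can be filled in later.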
A critical note
Very good: you are reflexively recording the scanning and post-processing procedure. You are aware of what you are doing and of how others can trace this process in order to compare their dataset with yours. This process, together with the contextual archaeological data and the descriptions of selection procedures, is the most important thing to document, and it ensures the actual sustainability of the data. Native files risk becoming unsupported, specific brands of 3D scanner (such as DAVID) cease to be produced, and technology becomes obsolete, as do the ‘high resolution’ and ‘quality’ of its digital outputs. These are always momentary, depending on the potential of a particular piece of equipment at a particular instant in the vortex of technological progress…
Frischer, B. (2008). From digital illustration to digital heuristics. In B. Frischer & A. Dakouri-Hild (Eds.), Beyond Illustration: 2D and 3D Digital Technologies as Tools for Discovery in Archaeology (BAR International Series 1805). Oxford: Archaeopress, pp. v–xxii.
Huvila, I. (2012). The Unbearable Complexity of Documenting Intellectual Processes: Paradata and Virtual Cultural Heritage Visualisation. Human IT, 12(1), 97–110.