In many forensic scenarios, criminals attempt to conceal their identity by covering their face and other distinctive features. In such cases, the available material may nevertheless reveal other unique traits that can be used for identification, such as the hands. Many state-of-the-art (SOTA) hand-based biometric systems can accurately identify individuals in constrained environments. In forensic investigations, however, the environment is often unconstrained, which makes identification considerably more challenging and reduces accuracy.
In this thesis, several ways of enhancing the performance of SOTA hand-based identification models in unconstrained environments are explored, namely hand alignment, fusion and loss function optimisation. Multiple hand-alignment methods are evaluated, including a simple rotation, an affine transformation and palmprint extraction. Several fusion techniques are also evaluated, covering both score-level and rank-level approaches, while a single alternative loss function is considered, namely the additive angular margin loss.
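For context, the additive angular margin loss modifies the softmax logits by adding a fixed angular margin m to the angle between an embedding and its target class centre, forcing same-identity embeddings to cluster more tightly. A minimal NumPy sketch of the logit computation is given below; the function name and the scale s and margin m values are illustrative, not taken from the thesis:

```python
import numpy as np

def arcface_logits(embeddings, weights, labels, s=30.0, m=0.5):
    """Additive angular margin logits: s * cos(theta + m) for the
    target class, s * cos(theta) for all other classes."""
    # L2-normalise embeddings (rows) and class weight vectors (columns)
    emb = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    cos = np.clip(emb @ w, -1.0 + 1e-7, 1.0 - 1e-7)  # (N, C) cosines
    theta = np.arccos(cos)
    target = np.zeros_like(cos, dtype=bool)
    target[np.arange(len(labels)), labels] = True
    # Penalise only the target-class angle, then rescale
    return s * np.where(target, np.cos(theta + m), cos)
```

These logits would then be passed to a standard cross-entropy loss during training.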
The results of this study show that applying hand-alignment and fusion techniques can improve the identification rate of SOTA hand-based identification models. Specifically, the simple hand alignment and the score-level fusion with Weibull normalisation emerge as the best-performing methods. Combining hand alignment and fusion yields the highest absolute performance improvement, as much as 18.8\%. The results for the additive angular margin loss are inconclusive, however, as it does not consistently improve performance.
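To illustrate the idea, score-level fusion combines per-matcher similarity scores after normalising them to a common range and only then ranks the gallery identities. The thesis uses Weibull normalisation; the sketch below substitutes simple min-max normalisation and sum-rule fusion for brevity, and all identities and scores are hypothetical:

```python
def min_max_normalise(scores):
    """Map one matcher's raw similarity scores to [0, 1]."""
    lo, hi = min(scores.values()), max(scores.values())
    return {identity: (s - lo) / (hi - lo) for identity, s in scores.items()}

def score_level_fusion(matcher_scores):
    """Sum-rule fusion: normalise each matcher's scores, then average
    them per gallery identity."""
    normed = [min_max_normalise(s) for s in matcher_scores]
    identities = normed[0].keys()
    return {i: sum(n[i] for n in normed) / len(normed) for i in identities}

# Hypothetical scores from two hand matchers (e.g. left and right hand)
left = {"id1": 0.9, "id2": 0.4, "id3": 0.1}
right = {"id1": 0.2, "id2": 0.8, "id3": 0.3}
fused = score_level_fusion([left, right])
best = max(fused, key=fused.get)  # identity ranked first after fusion
```

Normalisation matters here because the two matchers may produce scores on very different scales; without it, one matcher would dominate the fused ranking.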
Despite optimising the SOTA models, the identification rate in unconstrained environments remains far from optimal. Future research is therefore required to determine whether other methods can deliver even larger performance improvements, including alternative hand-alignment methods, image enhancement and the use of large-scale data.