Analysis of catch for stock assessment and reporting is becoming increasingly important, and the future may hold more stringent requirements on vessels to be capable of such analysis. Many of the world's fisheries involve smaller vessels that do not have conveyor belts or automated processing. For these fisheries, it might be useful to do 3D scanning for catch analysis using mobile units – maybe even smartphones. We are developing a proof-of-concept system called CatchSnap to test this idea.
CatchSnap will enable catch analysis by 3D scanning using a mobile unit. The 3D scans will be used to estimate the species and number of the fish. The images to the right illustrate this: a few seconds of video from a smartphone is converted to a 3D mesh of the fish (bottom), through two intermediate processing steps (top and middle).
Machine learning is required to recognize the different species of fish, and for this to work the system needs to be trained on very large data sets. Synthetic data sets, based on real 3D scans of fish, will be used to train a deep learning system, before continuing the training on real hand-labeled 3D images.
SINTEF is working on building a portable high-precision 3D scanner that will be used to make virtual reality models of large amounts of fish. With random variations on these models, only relatively few real fish will be required to generate an enormous synthetic data set to train a deep learning system. Knowledge from developing the portable scanner will be used when developing a handheld scanner.
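To make the idea of "random variations" concrete, the sketch below shows one simple way a single scanned fish mesh could be turned into many synthetic training samples: random per-axis scaling, a random rotation, and small vertex jitter. This is a minimal illustration using NumPy; the function name, perturbation types, and magnitudes are our own assumptions, not CatchSnap's actual data-generation pipeline.

```python
import numpy as np

def synthesize_variant(vertices, rng):
    """Create one synthetic fish shape by randomly perturbing a scanned mesh.

    vertices: (N, 3) array of mesh vertex positions from a real 3D scan.
    rng: a numpy random Generator.
    """
    # Anisotropic scaling: vary length/height/width independently (here +/-10%)
    scale = rng.uniform(0.9, 1.1, size=3)
    out = vertices * scale
    # Random rotation about the vertical (z) axis to vary the pose
    theta = rng.uniform(0.0, 2.0 * np.pi)
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s, 0.0],
                    [s,  c, 0.0],
                    [0.0, 0.0, 1.0]])
    out = out @ rot.T
    # Small vertex jitter to mimic shape variation between individuals
    out += rng.normal(scale=0.002, size=out.shape)
    return out

rng = np.random.default_rng(42)
scan = rng.random((500, 3))  # stand-in for one real scanned mesh
dataset = [synthesize_variant(scan, rng) for _ in range(1000)]
```

In a real system the perturbations would have to be chosen so that the variants stay biologically plausible for each species, but the principle is the same: a few real scans, multiplied by cheap random transformations, yield a large labeled training set.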
SINTEF will have a stand at the Nor-Fishing exhibition 2018 (nor-fishing.no), where you can try a VR experience game showing how CatchSnap will work. Come take a look! For those who cannot attend, we have a video of the VR experience below.
For more information, contact Martin von Heimburg at firstname.lastname@example.org.