Increasing flight range and monitoring capacity – Two New EVO Drones added to the SeaBee family!

At last, a long-awaited arrival: SeaBee has received two new EVO drones.

The new drones will increase SeaBee’s operational range from a few kilometres to tens of kilometres. They also increase the ‘payload’ capacity (the equipment a drone can carry), so that more and better sensors can be mounted at the same time, making each flight more effective.

SeaBee Drone Operator Medyan Ghareeb and Project Coordinator Kasper Hancke are very excited to start working with the newest member of the SeaBee family, one of two EVO VTOL drones.

The EVO drones are capable of Vertical Take-Off and Landing (VTOL), which means they do not require a runway and can lift off and land much like a helicopter. This gives SeaBee a strong advantage in research, mapping and environmental monitoring along the difficult terrain of the Norwegian coastline, not to mention in the mountainous and forested regions of Norway!

Curious about what data SeaBee has collected so far? Visit geonode.seabee.sigma2.no to see the range of locations and data types.

The EVO drones are state-of-the-art fixed-wing drones that look like a small plane. They have a wingspan of 269 cm, a take-off capacity of 10 kg and a flight time of up to 3 hours. They are equipped with a ‘standard’ RGB camera (62 megapixels), as well as multispectral and thermal cameras.

Read more about the technical details at DeltaQuad Evo – DeltaQuad VTOL UAV 

Watch this space for more updates once we get these drones out into the field! 

One of the new EVO VTOL drones in the SeaBee family on display in the NIVA social zone in Oslo.

Using drones for mapping seagrass and coastal carbon inventories

SeaBee is being proven in the field through different applications and subprojects. ZosMap is a SeaBee subproject focused on developing applications for seagrass mapping and monitoring using drones (Unoccupied Aerial Vehicles, UAVs), high-resolution aerial imaging, and machine learning (ML) technology for image analysis and thematic mapping.

The aim is to explore the possibilities of measuring seagrass distribution, biomass, carbon content, and health status using flying drones. Not only are seagrass meadows essential for sustaining marine biodiversity and providing nursery grounds for fish, but they also host large amounts of organic carbon bound in living biomass and in the sediments below the meadows.

Mapping seagrass mission

In mid-June, a team of NIVA researchers and drone pilots once again went to map seagrass meadows and explore the amount of organic carbon that is stored in these vital coastal habitats.

The fieldwork is part of an annual monitoring campaign with drone missions flown every month. Five researchers from NIVA participated in this fieldwork. In addition, engineering researchers and two Master’s students from NTNU are involved in the project work.

Just outside Larvik, around Ølbergholmen in the Oslofjord, ZosMap has a test site for developing methods and testing new hypotheses. Two small bays define the test area, which hosts both dense and sparse seagrass meadows, rockweed beds, sandy sediments with microalgae, and a few scattered kelp occurrences.

The two adjacent field sites close to Larvik: the southern bay is named "Ølberg S" and the northern bay "Ølberg N" (after the peninsula on the outside of the two bays). N59.007, E10.132.
NIVA partners working in SeaBee, in the field and ready to map seagrass meadows and explore the amount of organic carbon that is stored in these vital coastal habitats. Photo by Kasper Hancke.

A 360 degree panorama of the field area in Larvik, taken using a Mavic mini Pro3. Panorama by Kasper Hancke.

Flying drones

Over four days in June, the team flew multiple drone missions to collect imaging data, covering both a larger overview of the region (at high altitude) and high-resolution images for detailed studies (at low altitude). To assess the importance of pixel resolution for seagrass and organic carbon mapping, drone missions were flown at altitudes from 20 to 100 m.
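As a rough illustration of why flying altitude matters, the ground sampling distance (GSD, the size of one image pixel on the ground) scales linearly with altitude. The short sketch below uses assumed camera parameters (focal length and pixel pitch), not the specifications of the cameras flown in this mission:

```python
# Illustrative ground sampling distance (GSD) calculation.
# The focal length and pixel pitch are assumed example values,
# not the specifications of the cameras used in the ZosMap missions.

def gsd_cm_per_px(altitude_m: float, focal_length_mm: float, pixel_pitch_um: float) -> float:
    """Return the ground footprint of one pixel (in cm) for a nadir-looking camera."""
    # GSD = (pixel pitch * altitude) / focal length, converted to centimetres.
    return (pixel_pitch_um * 1e-6 * altitude_m) / (focal_length_mm * 1e-3) * 100.0

if __name__ == "__main__":
    focal_length_mm = 24.0  # assumed lens focal length
    pixel_pitch_um = 3.3    # assumed sensor pixel pitch
    for altitude in (20, 40, 60, 80, 100):
        print(f"{altitude:>3} m altitude -> ~{gsd_cm_per_px(altitude, focal_length_mm, pixel_pitch_um):.2f} cm/px")
```

With these assumed values, doubling the flying altitude doubles the pixel footprint on the ground, which is exactly the trade-off between coverage and detail explored in the 20–100 m missions.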

The drones used in the field mission (top and lower right), and the seaweed species that were mapped (lower left). Photo by Kasper Hancke.

A DJI Matrice 300 drone was deployed, equipped with an RGB camera (SeaBee Tech) and a multispectral (MSI) camera (MicaSense Altum). In addition, a DJI M600 drone was deployed with a hyperspectral (HSI) sensor (SPECIM AFX-10). The RGB images will be used primarily for image annotation and to provide a good true-colour overview, while the MSI and HSI image data will be used to generate thematic vegetation maps and quantify seagrass biomass, organic carbon content and ecosystem health status.

In addition to drone data collection, the team was busy collecting ground truth data using traditional techniques in order to annotate (categorise) image data and to build a database for training of machine learning algorithms. 

Measuring seagrass using ground-based methods. Photo by Kasper Hancke.

The algorithms will in turn be able to identify seagrass and other coastal vegetated habitats automatically from drone data, in the same way as a smartphone identifies the face of its owner.

In June we found the seagrass meadows of the Larvik region to be both well developed, in good ecological condition and rich in organic carbon hosted in its dense seagrass canopies – Kasper Hancke, NIVA

 

Importance of seagrass meadows

Seagrass meadows in the Oslofjord are often under strong pressure from anthropogenic stressors, so it was great to experience healthy, living seagrass meadows in these areas. These areas are not only a SeaBee/ZosMap test ground, but also a well-visited recreational area for beachgoers, picnickers, kayakers and many others.

Next Steps

Next steps are to analyse the images further in the SeaBee data pipeline. The results will be published when the data and laboratory samples have been processed.

For more information on how SeaBee works with coastal habitat mapping, visit ‘What We Do’

Two weeks, 440 drone missions, 100 thousand images – The SeaBee Research Infrastructure is taking off!

SeaBee is creating a comprehensive national research infrastructure, enabling sharing and use of data, collected with drones, to better understand Norway’s natural environment. SeaBee trains machine learning algorithms to analyse drone data and facilitate automated data analysis. This allows researchers to do their work faster, better, and more efficiently.  

 
SeaBee and SeaBirds – collecting drone images in the field 

Two field teams from NINA were out in May 2023. Led by Sindre Molværsmyr (NINA), the teams mapped gull and cormorant breeding colonies along the southern Norwegian coast, from the Swedish border to Karmøy (more than 500 km of coastline). The goal was to count seabirds by collecting drone images of seabird populations. The data from the drone images can then be compared with historical datasets and with data collected using traditional methods.

Traditional methods of data collection involve people going to a seabird colony and counting the birds, or taking thousands of photos with a handheld camera from the window of a small airplane. The seabird population counting work also contributes to SEAPOP (Om SEAPOP – SEAPOP).

Map of missions flown to map breeding seabird colonies, during field work by NINA. Photo Sindre Molværsmyr

In total, 440 drone missions were flown by the two teams over two weeks, from 15th to 29th May 2023, each using a DJI Mavic 3 equipped with an RGB camera (SeaBee Tech). The surveys mostly covered open-nesting species, like gulls and cormorants.

With very nice weather for most of the field mission, the teams collected around 100 thousand drone images of seabird populations during the 440 missions. The largest missions collected around 2000 images to cover larger islands along the coast. Both teams covered about 150 km of coastline each day, one team with their own boat and the other with good help from Skjærgårdstjenesten or other local boat drivers.

The teams covered most known bird populations in the mission area. All missions were logged and added to the map and data visualisations. This will help with planning next year’s field missions, as colonies that were missed this year are easily seen and located in the map visualisations.

 

Our field work and data collection could not have been done at this scale without SeaBee – sending data for processing by pressing enter on my computer in the evenings was very efficient. The amount of data collected would be impossible to analyse manually, it would take many months to analyse drone images from 400 colonies by hand – Sindre Molværsmyr, NINA

Preparing for drone flying mission from the boat. Photo by Sine Dagsdatter Hagestad
Pilot flying drone out from the boat to capture images for counting seabird populations during NINA's field mission in May 2023. Photo by Sine Dagsdatter Hagestad.
 
Large scale data upload

NINA’s field campaign is one of the first examples of the SeaBee infrastructure operating on large-scale data handling, with drone images uploaded to the SeaBee data server directly from the field at the end of each day. This is a great achievement for SeaBee: it not only saves time in the field, but also allows more efficient mapping and monitoring of large geographic areas with better data quality.

James Sample (NIVA) and the SeaBee Data Platform team have worked hard to get to this point (SeaBee Data Platform components and how it works). Using code deployed on Sigma2, an automated script runs every hour to identify, process and publish new datasets uploaded by scientists in the field.
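A minimal sketch of how such a scheduled job can be structured is shown below. The endpoint, bucket name and the two stub functions are hypothetical placeholders for illustration only; they are not the actual SeaBee code.

```python
# Minimal sketch of an hourly "identify, process, publish" job, in the spirit of
# the pipeline described above. The endpoint, bucket name and stub functions are
# hypothetical placeholders, not the actual SeaBee implementation.
from minio import Minio


def process_mission(mission: str) -> None:
    """Stand-in for the real processing step (orthorectification, ML analysis, etc.)."""
    print(f"Processing {mission} ...")


def publish_mission(mission: str) -> None:
    """Stand-in for publishing the finished products to GeoNode."""
    print(f"Publishing {mission} ...")


def main() -> None:
    client = Minio("storage.example.org", access_key="...", secret_key="...")
    bucket = "drone-uploads"       # hypothetical bucket of field uploads
    processed: set[str] = set()    # in practice this state would be persisted
    # Treat each top-level prefix in the bucket as one uploaded mission.
    prefixes = {obj.object_name.split("/")[0]
                for obj in client.list_objects(bucket, recursive=True)}
    for mission in sorted(prefixes - processed):
        process_mission(mission)
        publish_mission(mission)
        processed.add(mission)


if __name__ == "__main__":
    main()  # scheduled to run hourly, e.g. via cron
```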

It was an exciting first test of SeaBee as “data infrastructure”. I really enjoyed being able to explore the new mission data published each morning to see what I could find – James Sample, NIVA

Ready for reviewing drone data collected during the day, before uploading to the SeaBee Data Pipeline
SeaBee Data Pipeline

The SeaBee data infrastructure was valuable during this mission, as having data processed in near real-time allowed the field teams to review data and adjust settings and drone flying strategies during the mission, to ensure the best quality drone images were collected.

In total, around 1 TB of raw drone images was collected and uploaded to the SeaBee data repository, which became around 15 TB of data once processed (Digital Terrain Models (DTMs), orthophotos, point clouds, etc.).

The SeaBee data infrastructure ran well for processing such a large amount of data. On average, missions took around 90 minutes to process, while the largest missions (approximately 2000 images) took around a day. After processing, the final products were automatically published to SeaBee’s GeoNode server. Currently, there are more than 500 data resources published on the server.

This project has been an important achievement for SeaBee as it demonstrates how useful a drone-based infrastructure is for seabird research and mapping, and it serves as a good example of how efficient SeaBee can be for large-scale mapping projects and handling of big data. – Kasper Hancke, NIVA

 

Drone image of seabird population, collected during NINA's field work in May 2023. Photo by Sine Dagsdatter Hagestad
Next steps

The next steps for this seabird mapping project include analysing the drone images that have been collected to estimate seabird population numbers. This will involve training a new AI model and more image annotation work, as bird species were encountered that are not in the current AI models (for more information, see using Artificial Intelligence and drone data). Also, some comparisons will be made between the data collected using drones and data collected using traditional methods.

For the SeaBee data infrastructure, this field mission provided some very useful lessons on processing large-scale datasets. Better error-handling systems will be developed, and the machine learning models from NR will be integrated to further automate the infrastructure.

SeaBee – The complete package

SeaBee and the Norwegian coast 

The Norwegian coastline is more than 100,000 km long. The coast is where people live and enjoy the marine environment, and where industry and society interact in many different ways.  

Society urgently needs better infrastructure to map and monitor where land meets the sea, so we can better understand environmental changes, assess climate impacts and manage the consequences of human activities. This is where the SeaBee Research Infrastructure can play a crucial role.

Flying, floating and underwater drones can revolutionise mapping and conservation of marine environments in Norway. Not only can drones reach areas of the coastline that are inaccessible to people, they can also collect high-resolution data that are then efficiently analysed in cloud computing infrastructure.

 

“SeaBee is a cross-disciplinary project that aims to build a research infrastructure with the aim to provide drone services to research and mapping and monitoring authorities.”  – Kasper Hancke, SeaBee project coordinator, NIVA 

 

Watch the film today! 

What does SeaBee offer?  

  • Detailed, adaptive planning of field campaigns  
  • Expert drone operators with state-of-the-art drone and sensor technology  
  • Collection of drone images and data about vital marine species and coastal habitats  
  • Processing using machine-learning and cloud infrastructure
  • Visualisation and interpretation of the results by scientific experts
  • Provision of quality-assured decision support for coastal management    

SeaBee cooperates with drone and sensor-related companies and provides research opportunities for students.   

  
“The team of SeaBee makes something possible that no one has done before: combining the knowledge and the end users in marine biology, knowing what the management authorities know and all in close collaborations with engineers and computer scientists”  – Kasper Hancke, SeaBee project coordinator, NIVA 

Mapping the Norwegian coast using Artificial Intelligence and drone images

Data Analysis in SeaBee

Artificial Intelligence (AI) and machine learning are discussed widely – especially regarding their role as daily tools in our society. Within research, various machine learning and AI methods have been in place for some time. However, there are always new ways to apply these tools within research. We implement some of these new applications in the SeaBee Research Infrastructure.

We use AI and machine learning extensively within SeaBee, and for good reason. AI often provides excellent performance for a wide range of vision-based tasks, where specific objects or features in images need to be identified. The data analysis in SeaBee focuses on implementing state-of-the-art AI methodology for object detection and mapping of relevant variables (for example, types of kelp, benthic habitats, seals or seabirds).

Arnt-Børre Salberg (Norsk Regnesentral) is leading the Data Analysis work, in partnership with NIVA, NINA, NTNU, IMR, and SpectroFly (About SeaBee).

 
Data analysis pipelines in place so far

The data analysis pipeline works in two modes: training and inference. In training mode, we teach an AI model from training data. In inference mode, we apply the trained AI model to new drone data. The pipeline is sensor independent and works on RGB, multispectral and hyperspectral images. It is not tuned to specific nature types or animal species; it learns whatever is specified in the training data.
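As a loose, generic illustration of these two modes (not the actual NR/SeaBee pipeline code): the model below can be trained and then applied to new data, and the only thing that changes between RGB, multispectral and hyperspectral input is the number of input channels.

```python
# Generic sketch of a sensor-independent segmentation model with a training
# and an inference mode. This is an illustration only, not the NR/SeaBee code.
import torch
import torch.nn as nn


class TinySegmenter(nn.Module):
    """Minimal per-pixel classifier; in_channels = 3 (RGB), ~5-10 (MSI) or 100+ (HSI)."""

    def __init__(self, in_channels: int, n_classes: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, n_classes, kernel_size=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def train_step(model, images, labels, optimiser) -> float:
    """Training mode: learn from annotated images."""
    model.train()
    optimiser.zero_grad()
    loss = nn.functional.cross_entropy(model(images), labels)
    loss.backward()
    optimiser.step()
    return loss.item()


@torch.no_grad()
def infer(model, images) -> torch.Tensor:
    """Inference mode: predict a class map for new, unlabelled drone imagery."""
    model.eval()
    return model(images).argmax(dim=1)


# Example with fake multispectral data: 5 bands, 3 habitat classes.
model = TinySegmenter(in_channels=5, n_classes=3)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
images = torch.rand(2, 5, 64, 64)
labels = torch.randint(0, 3, (2, 64, 64))
print(train_step(model, images, labels, optimiser))
print(infer(model, images).shape)  # torch.Size([2, 64, 64])
```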

“The biggest achievement in our Data Analysis work so far is the first version of the data analysis pipeline for both thematic mapping of coastal habitats and detection of animals”

– Arnt-Børre Salberg, NR

 
How the pipeline works

Drones, flown by SeaBee pilots, collect images and data along the coast of Norway during field missions. The drone data are uploaded into the SeaBee Research Infrastructure and undergo pre-processing steps, such as orthorectification and image stitching. The drone images are then fed into an AI model that automatically analyses them. The AI model has been ‘trained’ using training data to recognise the objects and habitats that scientists are interested in.
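Stitched orthomosaics are usually far too large to feed to a model in one piece, so in practice they are read and analysed in tiles. The sketch below illustrates that idea using the rasterio library; the file path, tile size and the predict() stub are hypothetical, not part of the SeaBee pipeline.

```python
# Minimal sketch of tiling a large orthomosaic for model inference.
# The file path, tile size and predict() stub are hypothetical.
import numpy as np
import rasterio
from rasterio.windows import Window


def predict(tile: np.ndarray) -> np.ndarray:
    """Stand-in for the trained AI model; returns a per-pixel class map."""
    return np.zeros(tile.shape[1:], dtype=np.uint8)


tile_size = 512
with rasterio.open("mission_orthomosaic.tif") as src:  # hypothetical file
    for row in range(0, src.height, tile_size):
        for col in range(0, src.width, tile_size):
            window = Window(col, row,
                            min(tile_size, src.width - col),
                            min(tile_size, src.height - row))
            tile = src.read(window=window)  # shape: (bands, height, width)
            class_map = predict(tile)
            # In the real pipeline, the class map would be written back into a
            # georeferenced raster and published alongside the orthomosaic.
```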

The training data

In SeaBee, the training data for the AI models consists of two parts:

  • The drone image itself
  • The annotation describing the content of the drone images, e.g., bounding boxes locating birds, with corresponding information about species, sex, age, etc., or polygons specifying the habitat type of an area.

The annotation is created by people drawing bounding boxes or polygons around the objects or areas of interest in the image. This is often time consuming, especially when there are, for example, hundreds of birds in many images.
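To make the two parts concrete, an annotation is often stored as a simple record that pairs an image region with its labels. The records below are purely hypothetical examples of this idea, not SeaBee’s actual annotation schema:

```python
# Hypothetical examples of training annotations (not the SeaBee schema).

# A bounding box locating one bird in a drone image, with descriptive labels.
bird_annotation = {
    "image": "mission_042/IMG_0137.JPG",   # hypothetical image reference
    "bbox": [1024, 768, 1090, 842],        # pixel coordinates: x_min, y_min, x_max, y_max
    "labels": {"species": "Larus argentatus", "sex": "unknown", "age": "adult"},
}

# A polygon specifying the habitat type of an area, e.g. a patch of seagrass.
habitat_annotation = {
    "image": "mission_017/orthomosaic.tif",
    "polygon": [[10.131, 59.007], [10.133, 59.007], [10.133, 59.008], [10.131, 59.008]],
    "labels": {"habitat": "Zostera marina"},
}
```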

However, once enough training data have been collected, the AI models can be trained and added to the SeaBee model registry. The AI models can then analyse new drone images – making the analysis much faster and more efficient.

The automated data pipelines used in SeaBee. The pipelines are generic with respect to data, use several segmentation and detection models, and allow for partial annotation. Diagram from Arnt-Børre Salberg, presented at NORA Conference, June 2023.

Advantages and limits of using Artificial Intelligence

Advantages

The main benefit of applying AI in coastal management is the efficiency gained in analysing image data, and the ability to scale the analysis pipeline to handle very large data streams.

The AI methodology is data-driven and the method is quite generic. It can therefore be applied to a wide range of mapping questions.

 

Current Limits in SeaBee

For Arnt-Børre, the next priority is to address the issue of brittleness that often occurs in AI models. Brittleness occurs when the environment changes in such a way that the computer vision algorithm can no longer recognise the object due to some small perturbation.  

Annotation results from the bird-mapping exercise during the Runde field mission (August 2022). The automated pipeline has identified the birds on nests and in flight in the drone image. Image from Arnt-Børre Salberg, presented at the NORA conference, June 2023.

Brittleness in SeaBee happens when the AI algorithm is not able to cope with new data acquired under slightly different weather conditions or illumination, or with a new camera. A common response to such brittleness is to gather more training data, to fill what is thought to be a perceptual gap. This is the strategy we mainly follow in SeaBee as well, as drones can collect more good-quality training data very efficiently. We expect the AI algorithms to improve in performance as more and more drone images are collected, annotated, and applied as training data.
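Alongside collecting more real imagery, a common generic way to expose a model to such variation during training is image augmentation, which simulates different illumination and viewpoints from the images already annotated. The torchvision sketch below only illustrates that general idea; it is not SeaBee’s training code, and the file path is a placeholder.

```python
# Generic illustration of simulating varied illumination and viewpoints with
# torchvision transforms. This is not SeaBee's training code; as described
# above, SeaBee's primary strategy is to collect and annotate more real imagery.
from PIL import Image
from torchvision import transforms

augment = transforms.Compose([
    transforms.ColorJitter(brightness=0.3, contrast=0.3, saturation=0.2),  # lighting changes
    transforms.RandomHorizontalFlip(),                                     # mirrored viewpoint
    transforms.RandomRotation(degrees=10),                                 # slight rotation
    transforms.ToTensor(),
])

image = Image.open("example_drone_tile.jpg")  # placeholder training tile
augmented = augment(image)                    # a slightly different view on every call
```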

 


Successful implementation of state-of-the-art machine learning algorithms in SeaBee depends on graphics processing units (GPUs) and fast storage provided by the UNINETT/Sigma2 high-performance computing infrastructure.

SeaBee at GeoHab Conference

SeaBee experts attended this year’s international GeoHab conference (Marine Geological and Biological Habitat Mapping) on Réunion Island.

They gave two talks and contributed to a poster.

How we use drones to map under-studied regions

Kasper Hancke (NIVA, SeaBee’s coordinator) presented the SeaBee research infrastructure and how SeaBee uses flying and surface drones to map shallow-water habitats in the coastal zone. Kasper focused on the narrow stretch of coastline that hosts vegetated habitats, stretching from the upper intertidal line to the bottom of the euphotic zone (10–30 m water depth). This zone is under-studied in most mapping and monitoring programs due to the shortcomings of traditional data collection techniques; drone-based assessments, however, are ideal here.

Kasper Hancke in action, presenting SeaBee at GeoHab conference, 8th – 12th May 2023
A selection of current and future applications of SeaBee research infrastructure, presented by Kasper Hancke at GeoHab conference 8th - 12th May 2023.

 

It was a great pleasure for us to share and discuss our current work on developing drones, sensors, AI/ML analysis tools, and data visualization solutions – Kasper Hancke, SeaBee Coordinator 

 

How SeaBee supports research and innovation

Hege Gundersen (NIVA, SeaBee’s co-lead and project lead for KELPMAP) presented the first results from the KELPMAP innovation project. KELPMAP develops novel tools for kelp forest mapping using drone products and machine learning-based image classification. KELPMAP is funded by the Norwegian Environment Agency. It collaborates with SeaBee on kelp forest habitat mapping and aims to improve marine management actions.

Kasper Hancke presented a poster on behalf of the MASSIMAL project team (led by Martin Skjelvareid, funded by the Norwegian Research Council). The poster showed results from using hyperspectral imaging for benthic habitat mapping. 

Poster:

Mapping seagrass and rockweed habitats using UAV hyperspectral imaging and machine learning

View over the Vega archipelago (above) and different steps of the mapping process (below) used in the KELPMAP innovation project. The protocol used by KELPMAP (and the illustration above) was developed in collaboration with the SeaBee and OBAMA-NEXT projects. Pictures from the presentation by H. Gundersen at the GeoHab conference, 9th May 2023.

GeoHab

GeoHab represents an international group of people dedicated to marine habitat mapping. GeoHab gathers ~150 experts, students, stakeholders, and industry partners from around the world for an annual conference. The GeoHab conference was held from 8th – 12th May 2023 on Réunion Island. The conference proceedings can be found here, open access.

Sustainable and efficient research support

SeaBee at NIVA Sør´s 40th jubilee

NIVA Sør recently celebrated its 40th jubilee. Several SeaBee experts attended, among them Øyvind Tangen Ødegaard, a senior engineer at NIVA and one of SeaBee’s skilled drone pilots/technicians.

Øyvind presented SeaBee, highlighting how drones provide sustainable and efficient support for coastal research in Norway. Drones can be equipped with sensors that collect images and multiple types of data. Data collection using drones is a key part of the SeaBee drone-based research infrastructure.

Øyvind describes the different types of drones (flying, floating and underwater) that can be used to support research (left), and how drones can support coastal mapping and monitoring (right). Photo taken 27th April, NIVA Sør Jubilee, Grimstad.
Drone advantages

Drones offer several advantages over traditional field data collection methods, such as:

  • The ability to collect data in real or near real-time
  • Being cost-effective to use, and having less impact on the environment
  • Covering a larger area and reaching more difficult-to-access field sites
  • Allowing communities to collect data in a cost-effective and accessible way that supports citizen science

‘Drones can be particularly useful for regional monitoring and research, as they offer a sustainable and efficient way to collect data over large areas of coastline’ – Øyvind Tangen Ødegaard.

 

Traditional methods still valid

However, data collected by drones still needs the original context of what was happening in the field at the time. Context allows the data to be analysed and used in the best possible way, and helps with quality assurance. Therefore, there is still a need for traditional field methods.

It takes time to integrate drone sensors, and to automate collecting and processing data. Integration and automation are guided by traditionally collected data. The machine learning algorithms used for automating data processes are often trained and quality controlled using data collected with traditional methods (see Drone Data Product Validation for details on how this is done in SeaBee).

Øyvind explains the features of the SeaBee Maritime Robotics Otter during a break at the NIVA Sør jubilee. Photo taken 27th April, NIVA Sør Jubilee.

By using drones, scientists working with coastal management can reduce costs, limit environmental impact, and enhance research efforts.

For more information about SeaBee drones: SeaBee Tech

Seeing Seals with SeaBee – exploring drone use in counting seals

How many coastal seals are there?

Where are they along the coast?
Which drone and sensor combinations are suitable for counting them?

These are questions that part of the SeaBee team, led by Martin Biuw at the Institute of Marine Research (IMR), is working to answer. Together with scientists and drone pilots from NTNU, they are mapping and monitoring seal populations at different places along the Norwegian coast.
 
Seals and Drones

Harbour and grey seals (collectively referred to as ‘coastal seals’) play an important role as top predators in the coastal marine ecosystem and interact with coastal fisheries. Such interactions can sometimes lead to bycatch (i.e., seals being caught and drowned in fishing gear), and reliable estimates of population size are critical for assessing whether the level of bycatch is within sustainable limits for seal populations or not. IMR monitors seal population numbers regularly to provide sound management advice to the authorities (such as NFD).

While small drones can be excellent for counting seals in relatively small and distinct colonies, it can be very challenging in large archipelagos where seals may be spread out across any of a large number of skerries. In such areas, a better approach may be to use larger drones that can operate autonomously for longer periods, covering larger areas.

The field campaign

Tarva is a group of islands off the coast of Trøndelag that is home to a variety of marine species, including seals. It is one of the areas IMR surveys regularly, and it is close to SeaBee colleagues at NTNU. Therefore, Tarva is a good location to explore the use of new drone technology for counting seals.

A recent field campaign used a Vertical Take-Off and Landing (VTOL) Mugin-2 Pro 2930 CF drone, with a variety of sensors (SeaBee Tech). The team used this kind of drone here as it is the most practical solution where there are no flat areas for take-off/landing.

“We managed to successfully carry out relatively complex operations involving innovative drone technology, thereby showing it is feasible to collect data over much larger areas than we could with traditional small drones”

– Martin Biuw, IMR 

The MUGIN drone at the coast, ready for the next mission (Photo by Martin Biuw).

During this field mission, the flying height was 200 m, a compromise between achieving sufficient spatial coverage within the time restrictions and the ability to detect seals. The results from the flights gave some useful insight into how the sensors functioned.

While the flight operations worked very well, we found that the image resolution was insufficient for reliably detecting seals. Many seals could be seen relatively clearly, but it was sometimes difficult to distinguish a seal from the background in the collected images. Higher image resolution, combined with an infrared (IR) sensor synchronised with the RGB sensor, would improve this: the seals would be easier to pick out, since they give off more heat radiation than the background.

One result of using autonomous drones covering large areas is that we get lots of images, and many of these may not have any seals at all. Manually examining all images is extremely time-consuming. SeaBee is developing a machine-learning system to automatically detect seals in images, which will save a lot of time (led by our partners, Norwegian Computing Center) – Martin Biuw, IMR 

Example picture showing seals lying on rocks at Tarva; JPG image taken at 150 m altitude. Shutter: 1/2000 s, 2839x2747 pixels. (Photo from SeaBee).
What comes next

Overall, this field campaign showed promising outcomes for using these drones to detect seals in areas of the coast that are difficult to access. The SeaBee infrastructure gave us access to the latest in drone and sensor technology, as well as expert knowledge on seal behaviour to collect the best population estimates at challenging locations. 

Next, we plan to repeat the tests at Tarva using the same drone but with a better sensor system (higher image resolution and an infrared sensor), to further develop and refine the automated seal detection algorithm, and to develop a strategy for rolling out wider-ranging drone operations across important seal localities along the coast for regular monitoring.

 

For more information (in Norwegian): 

SeaBee Data Platform – Satisfying sense of progress

The data storage and sharing component of the SeaBee infrastructure (the SeaBee Data Platform) is where we design and implement cloud-based data storage, and data sharing applications (based on GIS-visualisation tools).

It allows scientists and other experts working in or with the SeaBee team to upload, process and share data collected from drone missions, all in one place. All datasets on GeoNode, one part of the data platform, are publicly available and exposed as Web Mapping Services (WMSs), so others can add them to their own maps (for example, in GIS).
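As an example of what ‘exposed as WMS’ means in practice, a layer can be pulled into your own scripts or GIS directly from the server. The sketch below uses the OWSLib library; the endpoint path and layer name are assumptions for illustration, and the actual service URL is listed on each layer’s GeoNode page.

```python
# Sketch of reading a published SeaBee layer over WMS with OWSLib.
# The endpoint path and layer name are assumptions for illustration;
# check the layer's GeoNode page for the real service URL and layer name.
from owslib.wms import WebMapService

wms = WebMapService("https://geonode.seabee.sigma2.no/geoserver/ows", version="1.1.1")
print(list(wms.contents)[:10])  # a few of the published layer names

# Request a small PNG rendering of a (hypothetical) layer over the Oslofjord.
img = wms.getmap(
    layers=["example_mission_orthophoto"],  # hypothetical layer name
    srs="EPSG:4326",
    bbox=(10.0, 58.9, 10.3, 59.1),          # lon/lat bounding box
    size=(512, 512),
    format="image/png",
)
with open("seabee_layer.png", "wb") as f:
    f.write(img.read())
```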

“We’re building an Open Source, centralised platform for efficiently processing data from many types of drones. We’re also making a big effort to build upon existing research infrastructure (NIRD) and to keep everything as open as possible.”

– James, leader of the team developing this component. 

The SeaBee team working on this has overcome several challenges, from the pandemic to adopting new technologies that meet user needs. Now, all the basic components of the SeaBee Data Platform are in place, which is crucial to the success of this phase.

“Getting this to work is one of the biggest challenges, but we are working towards a bigger contribution to the Norwegian research community by pushing what’s possible using national infrastructure”

– James, leader of the team developing this component. 

SeaBee Data Platform components
  • Data storage (using MinIO to store the data collected) 
  • Upload interfaces (integrated with Drone Logbook to extract metadata for drone missions) 
  • Four state-of-the-art workflows: 
    • Orthorectification (using Open Drone Map) 
    • Annotation (using ArcGIS Pro/Image analyst) 
    • Machine learning (with Norsk Regnesentral, using Convolutional Neural Networks and PyTorch) 
    • Publishing (using GeoServer and visualised by GeoNode) 
  • Documentation (stored on GitHub) 
Simplified workflow showing the main parts of the SeaBee Data Platform (hosted by Sigma2, the national e-infrastructure for data science in Norway).
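As a small illustration of the data storage and upload components listed above, pushing one day’s mission images into an object-storage bucket can be as simple as the sketch below. The endpoint, credentials, bucket and folder names are hypothetical placeholders, not SeaBee’s configuration.

```python
# Small illustration of the data storage component: uploading a day's mission
# images to a MinIO bucket. Endpoint, credentials, bucket and folder names are
# hypothetical placeholders, not SeaBee's configuration.
from pathlib import Path

from minio import Minio

client = Minio("storage.example.org", access_key="...", secret_key="...")

mission = "2023-05-20_example-mission"             # hypothetical mission ID
for image in Path("todays_images").glob("*.JPG"):  # hypothetical local folder
    client.fput_object(
        bucket_name="drone-uploads",
        object_name=f"{mission}/{image.name}",
        file_path=str(image),
    )
```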

All the code is Open Source and available on GitHub. The next steps will focus on standardising workflows and processing the backlog of data, as well as increasing automation (towards a production environment).  

James Sample (NIVA) has recently stepped up to lead the development of this component, taking over from Kristoffer Kalbekken (NIVA). Speaking on the recent developments on the SeaBee Data Platform, James says: 

“Over the past two months the SeaBee Team, supported by Sigma2, has worked really hard to get all the core components of the platform in place. It’s exciting to see what can be achieved using the national e-infrastructure and it’s great being able to process mission data more efficiently… The sense of progress is satisfying.”

Screenshot of the GeoNode section of the SeaBee Data Platform (5th March 2023), showing the toggle swipe functionality on data from a recent SeaBee mission on the coast of Norway.

The SeaBee Data Platform is hosted by Sigma2 on NIRD (the National Infrastructure for Research Data). It currently uses the following resources shared between different platform components: 

  • 64 Central Processing Units (CPUs) 
  • 2 Graphics Processing Units (GPUs; NVIDIA Tesla V100-SXM2-16GB)  
  • 200 GB memory  
  • 10 TB storage (~5 TB used)  
More information 

New method using drones to investigate beach deposits

In January, members of the SeaBee team published a scientific article describing a new method for quantifying the mass and carbon content of seaweed and seagrass beach deposits using SeaBee drones and research infrastructure. The new paper by Li et al. (2023) describes a cost-efficient method based on drones and automated image analysis.

From Kasper Hancke, the project coordinator and one of the authors: “SeaBee here provides a novel method using drones and photogrammetry to assess and quantify deposits in the beach zone” 

Three highlights from the article are: 

  • Unoccupied Aerial Vehicles (UAVs) provide a powerful tool for monitoring and management of beach wrack deposits.
  • UAVs perform better for wrack volume estimation than conventional manual methods.
  • Total Organic Carbon and Total Nitrogen in wrack range from 4.3-9.7 and 0.3-0.5 kg m⁻¹ of coastline, respectively.

Marine organic beach deposits (beach wrack) represent a source of mass, carbon and nitrogen. Beach wrack can be used in industrial products, and it impacts recreational activities (for example sunbathing, swimming and kayaking). It also provides food for coastal wildlife (many birds and other animals feed on these deposits). Lastly, beach wrack deposits play a role in the regulation of atmospheric CO2, and therefore in climate regulation and ‘blue carbon’ budgets.

You can read the full article in the Journal of Environmental Management here: Quantifying seaweed and seagrass beach deposits using high-resolution UAV imagery – ScienceDirect 

Graphical abstract from Li et al. (2023), illustrating the collection of images of beach deposits, the analysis, and the main results of the study.