SeaBee Stakeholder Event

Wednesday 13 November 2024, Norwegian Environment Agency, Oslo 

Join us at our upcoming seminar to explore the cutting-edge capabilities of the SeaBee drone infrastructure.  

Over the past four and a half years, SeaBee has built up an operational drone infrastructure with a wide range of applications for coastal and aquatic research, mapping, and monitoring.

By 1 April 2025, the SeaBee infrastructure will transition from the establishment phase to full operation, with end-to-end solutions for environmental applications such as mapping of coastal habitats (e.g. seagrass and kelp forests), estimation of seabird and seal populations, and drone-based assessment of water quality.

Don’t miss this opportunity to connect with leading experts and learn how SeaBee is revolutionizing coastal research and monitoring. Dive into the future with us! 

The event will be open to both physical and online participation and is free of charge. It will primarily be in Norwegian. 

Please register here: physical attendance (deadline 5 November 2024) or online participation (deadline 11 November 2024).

Venue: Miljødirektoratet, Grensesvingen 7, 0661 Oslo, and online (link will be sent after registration)

Drone-Based Mapping and Monitoring of the Coast

Written by Charlotte Synnøve Garmann-Johnsen

On 23 September, the Norwegian Water Association hosted an event on drone-based mapping and monitoring of the coast. The focus of the event was SeaBee, a new Norwegian infrastructure in which drones and advanced sensors deliver data and results for a range of water-related applications. SeaBee can be used to map habitats and species in shallow marine areas, count seabirds and marine mammals, and estimate water quality.

SeaBee brings together several national institutions that contribute expertise in drone flying, image analysis with artificial intelligence, data handling, and online visualisation. It also includes acquisition of field data for validating the drone results.

First, Kasper Hancke, senior researcher at NIVA (the Norwegian Institute for Water Research), gave his presentation. Hancke leads NIVA's research and development activities related to drones for environmental research and coordinates the SeaBee infrastructure. The project is coordinated from NIVA but has several partner organisations. NINA, the Norwegian Institute for Nature Research, contributes nature research. NTNU, the Norwegian University of Science and Technology, works on the engineering. The Norwegian Computing Center works on the artificial intelligence. The Institute of Marine Research works on marine mammals. GRID-Arendal handles communication. In addition, two industry partners contribute to the infrastructure: Tiepoint and Spectrofly. The project is funded by the Research Council of Norway.

“The team of SeaBee is doing something no one has done before, combining the knowledge and the end users in marine biology, with engineers and computer scientists, putting all of their expertise together is really a key point in developing a novel toolbox,” says Kasper Hancke.

Anders Gjørwad Hagen, senior researcher at NIVA, elaborated on ground truthing, which means verifying that the area being monitored actually matches what the drone imagery shows. This creates the perfect combination of human and machine.

Robert Dreier Holand from Tiepoint went on to talk about the rules and regulations for flying drones. He explained that you must take a course to become certified, and that a few hundred people have taken their courses.

Afterwards, Kristina Øie Kvile, a scientist at NIVA, talked about how drones can be used to map seaweed, kelp and eelgrass and to monitor algal blooms. They can also be used to count seabirds, marine mammals, and shells. It can be hard for the machines to interpret what is in the images, so systems that are continuously updated have been developed to support this classification.

Chief engineer Sindre Molværsmyr from NINA went on to talk about population estimation of seabirds and marine mammals. Traditionally, counting seabirds meant walking onto an island, and one of the challenges was that this could greatly agitate the birds. With SeaBee, a small drone is launched from a boat near the island, with no one setting foot on it, which disturbs the birds far less.

Then Arnt-Børre Salberg, senior engineer at the Norwegian Computing Center, talked about image analysis and the use of artificial intelligence. He discussed some of the challenges with AI, but also explained how SeaBee has handled them. Looking ahead, he said that SeaBee will use so-called foundation models (the same type of model that underpins ChatGPT), make annotation easier, and use more satellite data.

Second to last, James Edward Sample, senior engineer at NIVA, spoke about data handling, sharing and visualisation. One of the main goals of the SeaBee platform is to make SeaBee data available so that members of the public can use them as they wish. He went on to explain how data flow between the various parts of the SeaBee infrastructure.

Lastly, Hege Gundersen, senior researcher at NIVA and co-lead of SeaBee, spoke about the potential and relevance of SeaBee in water and coastal management. She discussed the so far untapped potential of SeaBee for monitoring invasive species such as Pacific oysters, and for commercial purposes such as kelp cultivation. Perhaps you have a use for SeaBee? Maybe you work in a municipality that knows its species on land but not in the sea, or in a county administration and wonder how much carbon is stored in the blue forests along your coast. Or maybe you are a manager or researcher with a good idea for how to use SeaBee?

The potential for using SeaBee and continuing this work remains high, and we look forward with great anticipation to seeing how SeaBee will be applied in the future.

https://www.tekna.no/fag-og-nettverk/miljo-og-biovitenskap/bio-og-klimabloggen/dronebasert-kartlegging-og-overvaking-av-kysten/

Joint Formentera Drone Mapping Campaign

Written by Charlotte Synnøve Garmann-Johnsen

Take-off with a green LIDAR sensor mounted on a drone for high-resolution 3D scanning of seagrass meadows at Formentera in the Mediterranean. Drone-borne green LIDAR systems are brand new to research, and this is perhaps the first time they have been applied to seagrass research in Europe. The LIDAR system is a Navigator, delivered to SeaBee by YellowScan in August 2024 and mounted on a Tundra2 drone from Hexadrone. A powerful addition to SeaBee's drone infrastructure.

An intensive field campaign took place in Formentera, Spain, from 16 to 20 September. This was a collaboration between SeaBee and two ongoing Horizon Europe projects, OBAMA-NEXT and C-BLUES, aimed at advancing our understanding of marine ecosystems.

For the first time in SeaBee, and perhaps for the first time in European marine research, a drone-borne green LIDAR sensor has been applied to map submerged vegetation and seagrass meadows! Green LIDAR offers a completely new perspective on seagrass studies by allowing the generation of high-resolution 3D models of the meadows at centimetre-scale resolution. This in turn enables the creation of digital twin models, estimation of seagrass canopy height, and quantification of biomass and carbon stocks.
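To illustrate how canopy height can be derived from such a point cloud, here is a minimal sketch (not the SeaBee processing chain; the grid size, percentiles and array layout are assumptions for illustration):

```python
import numpy as np

def canopy_height_grid(points, cell=0.25):
    """Estimate seagrass canopy height per grid cell from a green-LiDAR
    point cloud.  `points` is an (N, 3) array of x, y, z in metres.
    Within each cell the lowest returns approximate the seabed and the
    highest the canopy top; their difference is the canopy height.
    The 0.25 m cell size and the percentiles are illustrative choices."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    ix = ((x - x.min()) // cell).astype(int)
    iy = ((y - y.min()) // cell).astype(int)
    cells = {}
    for cx, cy, cz in zip(ix, iy, z):
        cells.setdefault((cx, cy), []).append(cz)
    return {
        cell_id: float(np.percentile(zs, 95) - np.percentile(zs, 5))
        for cell_id, zs in cells.items()
    }
```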

Overall, the campaign focused on mapping and classifying marine blue carbon habitats, including seagrass meadows, and their capacity to store carbon and promote biodiversity, among other ecosystem services.

A multidisciplinary team of marine biologists, drone and satellite remote sensing specialists, and data analysts from NIVA, RBINS and IACM worked together to create a comprehensive understanding of these complex ecosystems and map their distribution.

Senior researcher and SeaBee coordinator Kasper Hancke measuring Ground Control Points (GCPs) at Formentera, for accurate positioning of drone and satellite orthomosaics

‘The main goal of this campaign was to develop and evaluate methodologies for mapping and monitoring seagrass and other submerged coastal vegetation. In particular, the focus was on integrating satellite and drone data using AI predictions intercalibrated with ground truth data’ – Kasper Hancke.

To address this, RGB and multispectral image data from drones and satellites were collected, along with underwater ground truth data and ground control points for validation and referencing of the remote sensing data.

A drone ground control unit showing the drone mission status, and the images recorded by the drone, here a seagrass meadow and a coastal rock.

Drone pilot Robert N Poulsen and SeaBee coordinator Kasper Hancke planning a drone mapping mission (left) and a DJI M350 drone just before take-off on a mapping mission at Illetes, Formentera.

The images give us information about seagrass distribution, canopy height, carbon storage and other relevant factors for monitoring the seabed.

Take-off with a DJI M350 drone with a high resolution RGB camera onboard (DJI P1 camera). The same drone was also applied to fly multispectral missions with a five-channel Altum-PT sensor from MicaSense.

Uncovering the Ocean’s Secrets with Drones and 3D Technology

Drones and advanced cameras are powerful tools that help us understand our aquatic environments in ways that were previously impossible 

With groundbreaking sensor technology, drones can now scan life underwater – from the air. 3D environmental monitoring is the future, according to NIVA researchers. 

Published: 12.09.2024 
Written by: Gunnar Omsted 
Key researchers: Hege Gundersen, Kasper Hancke 

In a time where climate change and human activity increasingly impact our ecosystems, environmental monitoring has never been more important. At NIVA, we are continuously seeking new and innovative methods to improve our work, and right now, drone-based sensor technology is one of the most exciting developments. 

– Norway is already at the forefront. And the latest technology we are now testing is simply revolutionary. We are using laser technology from drones, which allows 3D scanning both above and below water, with a level of detail far higher than before. This gives us a unique opportunity to understand our aquatic environments in a whole new way, says senior researcher Kasper Hancke at NIVA. 

Despite the expensive equipment, significant cost savings can be achieved in environmental management. 

– Compared to traditional methods, we will soon be able to monitor much larger areas with much higher detail and, not least, much faster, says the NIVA researcher. 

This rotor drone with a LiDAR sensor is used to produce digital 3D models of an area with vegetation both in the sea and on land, or a coastline. (Photo: NIVA)
The Danish company Spectrofly is a regular partner in the SeaBee projects. (Photo: NIVA)
NIVA's drone pilot Medyan Ghareeb is testing new equipment. (Photo: NIVA)

Fast and efficient 
It is already well known that drones equipped with so-called RGB and multispectral cameras can scan large areas in a short amount of time. They provide us with detailed maps and data about water quality, pollution, and the distribution of plant and animal life. For instance, drones can detect harmful algal blooms that may be dangerous to both humans and animals – contributing to faster and more targeted interventions.
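As a simplified illustration of how multispectral bands can be turned into a bloom indicator, a normalized difference index between a red-edge and a red band is one common chlorophyll proxy (a sketch only; the band choice and threshold are assumptions, not the SeaBee method):

```python
import numpy as np

def chlorophyll_index(red_edge, red):
    """Normalized difference index often used as a chlorophyll proxy.
    `red_edge` and `red` are reflectance arrays from a multispectral
    camera; higher values suggest denser algae.  Illustrative only."""
    red_edge = red_edge.astype(float)
    red = red.astype(float)
    return (red_edge - red) / (red_edge + red + 1e-9)

# Example: flag pixels above an (assumed) bloom threshold of 0.2
# bloom_mask = chlorophyll_index(red_edge_band, red_band) > 0.2
```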

For several years, NIVA researchers, including through SeaBee projects, have monitored everything from water quality to life in and around water, including the sea, coast, rivers, and lakes. The data is used to understand how ecosystems change over time. 

– The latest development is that we are not only mapping distribution – like a two-dimensional image – but now we can also map and scan habitats in three dimensions, both above and below water, says Hancke. 

How? With the help of technology you might already have in your pocket. 

This is the future of environmental monitoring, where technology and research go hand in hand. 

Hege Gundersen 

Using ‘iPhone technology’ 
Although it is not as advanced, some of the same technology found in NIVA's cutting-edge drones is also built into newer iPhone models. A so-called LiDAR sensor on the phone emits infrared laser pulses that spread across the surroundings. When these pulses hit objects, they are reflected back to the sensor. By measuring how long it takes for a pulse to return, the iPhone can calculate the distance to various objects. This provides a three-dimensional map of the surroundings in real time.

But let’s go back to the air and the NIVA drones. Even though the distances are larger, the principle is the same as the technology in iPhones. By calculating the time it takes for laser pulses to return to the drone from the ground or seabed, the LiDAR sensor on the drone can quickly and accurately create a 3D/depth map of the surroundings. 
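The underlying calculation is straightforward time-of-flight ranging, sketched here for illustration (bathymetric LiDAR additionally corrects for the slower speed of light and refraction in water, which this toy example ignores):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second (vacuum; air is nearly identical)

def lidar_range(round_trip_time_s: float) -> float:
    """Distance to the reflecting surface from the two-way travel time
    of a laser pulse: the pulse covers the distance out and back."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A pulse returning after 400 nanoseconds corresponds to roughly 60 m
print(lidar_range(400e-9))  # ~59.96
```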

Here we see a three-dimensional terrain model of a coastal area near Larvik. The colours show height differences, where red is the highest (maximum height = 30 m). The model is made with red LiDAR, which causes the water surface in the middle of the model to appear flat because red LiDAR cannot penetrate water. With green LiDAR, you also get information about the seabed (bathymetry) and vegetation under the water.

A bathy-topographic model is a 3D representation showing both the seabed (bathymetry) and terrain (topography). The model is based on green LiDAR data and is displayed as a “point cloud,” meaning a large number of data points precisely placed in a 3D grid. The resolution is 8-12 points per square metre, with an accuracy of about 3 cm. 
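To give a feel for what that point density means for a gridded product, here is a small back-of-the-envelope calculation (the error-averaging assumption in the comments is ours, not a sensor specification):

```python
import math

points_per_m2 = 10        # mid-range of the stated 8-12 points per square metre
point_accuracy_m = 0.03   # ~3 cm per point, as stated

# Points falling in one 1 m x 1 m grid cell
n = points_per_m2

# If per-point errors are roughly independent and zero-mean, averaging n
# points reduces the uncertainty of the cell's mean height by ~1/sqrt(n)
# -- an assumption for illustration, not a property of the sensor.
cell_mean_uncertainty = point_accuracy_m / math.sqrt(n)
print(f"{cell_mean_uncertainty * 100:.1f} cm")  # ~0.9 cm
```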

For environmental monitoring, we also distinguish between green and red LiDAR. Hancke explains that for land-based mapping or vegetation analyses, red LiDAR is usually the best choice. This technology has been available for some years. For underwater mapping, however, green LiDAR is necessary.

– Green light has the ability to penetrate the water surface and down through the water. This technology has only recently become available for drones and opens up entirely new areas of application, such as mapping kelp, seaweed, and eelgrass in coastal zones. We can also use it for mapping river channels and lakes, as well as underwater vegetation, says Hancke. 

– It is now possible to map the seabed and species and habitats in shallow marine areas from drones, explains the NIVA researcher.

Researchers in the water: traditional fieldwork is still important. This is how researchers obtain the background data needed to interpret drone images and "train" the algorithms. (Photo: NIVA)
My drone is loaded with… green LiDAR: The future spearhead in environmental monitoring of aquatic environments? (Photo: NIVA)

Three dimensions better than two 
By including a third dimension, the 3D images from the LiDAR provide a much better understanding of the habitat’s volume and spatial distribution. 

– Take, for example, eelgrass beds, which are extremely important for marine biodiversity. The area coverage of a meadow only gives part of the answer to how much of this habitat exists. It is only when the height of the eelgrass is considered that we get a measure of the amount of eelgrass in the meadow, explains senior researcher Hege Gundersen at NIVA. 

This is important knowledge for calculating how much carbon is stored in the eelgrass biomass and how much carbon can potentially be sequestered in the seabed long-term. This allows researchers to assess the value of eelgrass meadows as carbon sinks and calculate how much CO2 eelgrass meadows absorb from the atmosphere. 
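A worked example of that chain of reasoning, from canopy volume to a CO2 figure (the conversion factors are placeholders for illustration only, not SeaBee's calibrated values):

```python
# From meadow area and canopy height to a rough CO2-equivalent figure.
area_m2 = 10_000            # a 1-hectare eelgrass meadow (example value)
mean_canopy_height_m = 0.5  # from the LiDAR 3D model (example value)

# Hypothetical conversion factors -- real studies calibrate these locally
dry_biomass_per_m3 = 0.2    # kg dry weight per m3 of canopy volume (assumed)
carbon_fraction = 0.35      # fraction of dry weight that is carbon (assumed)

canopy_volume_m3 = area_m2 * mean_canopy_height_m
dry_biomass_kg = canopy_volume_m3 * dry_biomass_per_m3
carbon_kg = dry_biomass_kg * carbon_fraction
co2_kg = carbon_kg * 44.0 / 12.0   # molar mass ratio of CO2 to C

print(f"Standing stock: ~{carbon_kg:.0f} kg C (~{co2_kg:.0f} kg CO2 equivalent)")
```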

From the air, we can clearly see the bright green eelgrass meadows here at Ølbergholmen near Larvik. 

This gives us a unique opportunity to understand our aquatic environments in a whole new way.

Kasper Hancke

Technology + research = success 
After the drones have finished their “buzzing” and the enormous amount of collected data is analysed, machine learning plays a crucial role. 

– Advanced algorithms make it possible to identify patterns and anomalies that may indicate environmental problems. Examples include a sudden reduction in the distribution of an important habitat or the spread of unwanted species, says Gundersen. 

By using machine learning, researchers can also predict future trends and potential threats, enabling preventive measures to be taken before problems escalate. 

– Understanding and monitoring our aquatic environments is crucial for preserving them. With drones, new sensor technology, and machine learning, we have powerful tools that help us get to know our aquatic environments in ways that were previously impossible. This is the future of environmental monitoring, where technology and research work hand in hand towards a sustainable future, says Gundersen. 

Original article posted by NIVA

This article has been translated from Norwegian 

KELPMAP – Upscaling drone-based maps using satellite images shows promise

Helgelandskysten is one of Norway’s most beautiful coastlines, holding World Heritage Status (UNESCO) and ‘Outstanding Universal Value’.

It has thousands of small islands, islets and skerries, mountains, fjords and a great deal of life above and under the water. It must be managed well to safeguard the valuable cultural history and ecosystems. Kelp forests are a key part of the ecosystems in this area. 

Helgelandskysten (above) and kelp forest (below). Photo from H. Gundersen, NIVA.

KELPMAP (NIVA and NR) is investigating whether it is possible to map kelp forests using drones and then upscale the collected information using satellite data. The project is financed by Miljødirektoratet and Norsk Romsenter, 2022–2024 (NIVA field report).

SeaBee is being used to map the benthic habitats. Both rotor drones and fixed-wing drones with RGB and MSI sensors (SeaBee equipment) are used to collect the data, which are uploaded to the SeaBee Data Pipeline.

Ground truth data are also sampled in the field, for training of algorithms and validation of data products. The drone images are then annotated, guided by the ground truth, to show which parts of the images are kelp forest and which are other habitats and species.

SeaBee has defined 42 different habitat classes in three analysis levels (shared on GitHub), many of them compatible with the NiN classifications. These results are analysed, quality controlled and presented as high-resolution maps of habitat classes for the whole study area – which show the kelp forests and other species. 
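To illustrate what a three-level class hierarchy looks like in practice, here is a toy excerpt (the class labels below are invented for illustration; the actual 42 classes are published on GitHub):

```python
# A toy excerpt of a three-level habitat hierarchy: each fine-grained
# annotation label rolls up to a coarser class at the level above.
HABITAT_HIERARCHY = {
    # level 3 (finest)  : (level 2,           level 1)
    "kelp_laminaria":     ("kelp_forest",      "vegetated_seabed"),
    "kelp_saccharina":    ("kelp_forest",      "vegetated_seabed"),
    "eelgrass":           ("seagrass_meadow",  "vegetated_seabed"),
    "rockweed":           ("intertidal_algae", "vegetated_seabed"),
    "sand":               ("soft_bottom",      "unvegetated_seabed"),
}

def roll_up(label: str, level: int) -> str:
    """Map a fine-grained (level 3) annotation label to the requested analysis level."""
    if level == 3:
        return label
    return HABITAT_HIERARCHY[label][0] if level == 2 else HABITAT_HIERARCHY[label][1]

print(roll_up("eelgrass", 1))  # -> vegetated_seabed
```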

Another new aspect is upscaling these results to cover a larger area using satellite imagery (Sentinel-2 with 10 m resolution and Pleiades with 2 m resolution). The upscaling showed promising results, which is important for environmental management at larger scales.
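One common way to perform such upscaling is to aggregate the high-resolution drone classification into the coarser satellite grid, producing a per-pixel kelp fraction that can be used to train or validate a satellite-scale model. Below is a minimal sketch under assumed resolutions and class codes, not the KELPMAP implementation:

```python
import numpy as np

def kelp_fraction_per_satellite_pixel(drone_class_map, drone_res_m=0.1,
                                      satellite_res_m=10.0, kelp_value=1):
    """Aggregate a high-resolution drone classification (2D array of class
    codes) into the coarser satellite grid, returning the fraction of each
    satellite pixel covered by kelp.  Resolutions and the class code for
    kelp are assumptions for this sketch."""
    block = int(satellite_res_m / drone_res_m)            # e.g. 100 x 100 drone cells
    is_kelp = (drone_class_map == kelp_value).astype(float)
    h, w = is_kelp.shape
    h_trim, w_trim = h - h % block, w - w % block          # drop ragged edges
    blocks = is_kelp[:h_trim, :w_trim].reshape(
        h_trim // block, block, w_trim // block, block)
    return blocks.mean(axis=(1, 3))                        # kelp fraction per pixel
```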

The results will be delivered to Miljødirektoratet this year to be used as a tool for measuring progress and implementation of existing national policies and management plans.

Miljødirektoratet is pleased with progress so far and with the results received. These advances also have potential applications in other national and European research, not least the Kunming-Montreal Global Biodiversity Framework (CBD).

The drone view of Helgelandskysten during field work. Photo from G. Medyan, NIVA.

SeaBee shines at GeoHab 2024

SeaBee Research Infrastructure team members proudly presented at the international GeoHab Conference 2024 (6th – 10th May) in Arendal, Norway.

SeaBee contributed four presentations, and was mentioned in the keynote by Terje Thorsnes (NGU) on marine habitat mapping programs in Norway.    

Kristina Kvile (NIVA) presented the latest news on drones, updated methods and protocols for marine habitat mapping. She shared how habitats are classified and how drones can be used to identify marine vegetation at various hierarchical levels ranging from habitat classes to species level using high-resolution multispectral sensors and AI classification tools.

Collecting direct observations from the ground and boat (above) and the possible resolutions of collected data. Photos from K. Kvile (NIVA).

Håvard Løvas (NTNU) told the audience about hyperspectral solutions for detailed identification of shallow-water species and objects with distinct optical ‘fingerprints’.

Hyperspectral imaging is a novel, powerful tool for coastal drone mapping. It can be used to characterise water quality, classify bottom substrates, and identify certain species based on their optical characteristics. However, interpreting images of underwater features and species is challenging, so careful, systematic data handling and calibration routines are essential to achieve good results.
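One standard way of matching a measured spectrum against known optical 'fingerprints' is spectral angle mapping, sketched below (a generic technique shown for illustration; the reference library is a placeholder and this is not necessarily the SeaBee/NTNU pipeline):

```python
import numpy as np

def spectral_angle(pixel_spectrum, reference_spectrum):
    """Angle (radians) between two spectra; smaller means more similar.
    Insensitive to overall brightness, which helps under varying
    illumination and water depth."""
    a = np.asarray(pixel_spectrum, dtype=float)
    b = np.asarray(reference_spectrum, dtype=float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def classify(pixel_spectrum, library):
    """Assign the library entry (e.g. 'kelp', 'sand') with the smallest
    spectral angle to the pixel."""
    return min(library, key=lambda name: spectral_angle(pixel_spectrum, library[name]))
```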

Øyvind Tangen Ødegaard (NIVA) described how to tame a surface drone (Otter Pro USV), and how its sophisticated on-board optical and acoustic sensors can complement the information from flying drones to produce even better maps and classifications of seafloor species and features.

The SeaBee Otter operates with a high-accuracy positioning system (better than 2-3 cm on measured data) which enables precise georeferencing of collected data – be it underwater photo and video,  acoustic data of the seafloor or optical data of the water quality. 

Views of how the SeaBee Otter is handled in the field. Photo by Ø. Odegaard (NIVA).

Kasper Hancke (NIVA) evaluated the use of drones and artificial intelligence for kelp forest and seagrass mapping, highlighting the main advantages of the SeaBee Research Infrastructure:

– Drones with RGB and MSI sensors combined are powerful tools for high resolution mapping and monitoring of coastal habitats and species 
– Artificial Intelligence models are a powerful method for classifying benthic habitats and are cost-effective
– High-resolution habitat maps are essential for coastal carbon accounting, for more sustainable management, and for guiding marine preservation and restoration initiatives.
 

Great days of science and exciting discussions. Thank you to the organizing committee! – Kasper Hancke, SeaBee project coordinator

For more information on SeaBee Research Infrastructure visit seabee.no, or follow us on LinkedIn: SeaBee Research Infrastructure.

 

SeaBee Out and About

The SeaBee experts have been busy, out and about sharing how SeaBee Research Infrastructure can be used, testing new possibilities and implementing the SeaBee Data Pipeline in coastal research and environmental monitoring activities.  

SeaBee at C-BLUES EU kick-off meeting   

There is a lot to discover about blue carbon ecosystems – seagrass meadows, tidal marshes, mangroves and macroalgae. The C-BLUES project aims to significantly advance knowledge and understanding of blue carbon ecosystems to reduce scientific uncertainty, improve reporting of blue carbon, and promote the role of blue carbon in delivering climate policy commitments.  

At the kick-off meeting (held 14–17 April in Barcelona), SeaBee coordinator Kasper Hancke (NIVA) presented how SeaBee and drones can contribute to efficient mapping and monitoring of blue carbon habitats across a range of coastal environments, and how drone data can be further developed into tools for assessing:

  1. ecological status of coastal systems,
  2. species and biomass of marine vegetation,
  3. stocks and content of blue carbon, with relevance to sustainable management and research on climate regulation.

C-BLUES will join forces with the already running Horizon Europe project, OBAMA-NEXT.  

C-BLUES is a Horizon Europe Framework project running from 2024-2028, funded under the call for EU-China international cooperation on blue carbon (HORIZON-CL5-2023-D1-02). 

Group picture at C-BLUES kickoff meeting, 14th – 17th April in Barcelona.
SeaBee at remote sensing seminar at MDIR 

Miljødirektoratet (MDIR) hosted a seminar on remote sensing for Norwegian environmental monitoring and mapping on 23 April 2024.

SeaBee co-coordinator, Hege Gundersen (NIVA), presented the KELPMAP project on using drones for mapping and identification of kelp and underwater vegetation.

KELPMAP develops novel tools for kelp forest mapping using drone products and machine learning-based image classification, and the results demonstrate that drone images can be used to identify kelp forests to a water depth of at least 10 m.

MDIR and the Norwegian Space Center funded SeaBee to develop the tools to map kelp forests and other habitats, in support of improved marine management.

For more information on SeaBee Research Infrastructure visit seabee.no, or follow us on LinkedIn: SeaBee Research Infrastructure.

 

New SeaBee Pilots Ready to Fly

Preparations are well underway for the main field season in SeaBee for 2024. 

SeaBee Research Infrastructure relies on drone-based data collection by experienced pilots and marine biology experts. Ensuring drone pilots are trained in the latest techniques and are up-to-date with new regulations and certifications is a key part of collecting data for mapping, monitoring and research along the Norwegian coastline.  

Drone's-eye view of the NIVA team getting ready to learn new skills in drone flying at Ølbergholmen, Larvik.
Drone Pilot Training 

At the end of March, a team from NIVA headed out to Ølbergholmen (Larvik) to train new pilots, and test the latest sensors and new EVO drones (New EVO drones added to SeaBee Family)  

“We had bright conditions, around 10°C, so relatively good for March, but the weather forecast was not so accurate, and there was more wind than we would have hoped for and expected. However, this was not a show-stopper, as the very lower levels of air were calm enough (less than 10m/s). Our larger drones were able to map at 60m. During lulls in the wind, we were able to test one of our NEW EVO R VTOL (Vertical Take Off and Landing) fixed-wing vehicles” – Medyan Ghareeb, Drone Operations (NIVA) 

  

Kristina Kvile (Data Validation lead) and Debhasish Bakta (Web interface development) went through the training manual and the practical procedures around drone flying. Both did exceptionally well flying two DJI Mavic Mini drones, with skills and confidence increasing throughout the day. Kristina and Debhasish are now certified for A2 and A1/3 respectively. They were supervised by SeaBee pilots Medyan and Øyvind, who later ran some test flights.

A panorama-like view of the field site where the team went through practical certifications and tested new SeaBee drones.
Testing new drones and sensors 

Øyvind Herman Torp (Drone Pilot) used the breaks between high winds to test the RedEdge Blue, Altum-PT and RGB sensors. He collected datasets with these sensors covering the whole Ølbergholmen area at an altitude of 60 m.

The wind subsided a little in the afternoon, and some test flights using the EVO R with the RGB and RedEdge-P dual setup were flown. The pilots were able to collect a complete dataset with the RedEdge-P sensor, but challenges with the RGB camera mean another attempt is needed to fully test both EVOs and collect a full dual-sensor dataset.

For details about the technology we use, visit SeaBee Equipment.

Nuts and Bolts of drone technology in SeaBee

The SeaBee Research Infrastructure relies on state-of-the-art drones and sensor systems to collect data. Anders Gjørwad Hagen (NIVA) is responsible for ensuring that the SeaBee drones are fit for purpose and carry suitable sensor systems for collecting data in a variety of situations and for multiple research purposes. The SeaBee project has been running for nearly four years, and now has multiple drones being regularly used by expert pilots in the field.

One of the EVO fixed wing drones prepared for takeoff by SeaBee crew

Identifying user needs 

Ensuring the SeaBee drones are fit for purpose means that user needs must be identified and understood. This was an important task from the early stages of the Research Infrastructure's development. In SeaBee, we consulted the partner organisations working on the SeaBee applications (What we do). At project workshops, meetings and field missions, their needs for drone capabilities and sensors were collected and mapped against available technology from industry suppliers. The first discussions on what kind of drone technology is needed to collect high-quality data took place at the SeaBee kick-off meeting at Oscarsborg fortress, Viken, in 2020.

These requirements were documented and kept in living documents, which are regularly updated, and have since been referred to when purchasing drones, sensors and the associated systems. To date, SeaBee Research Infrastructure has: 

  • 14 flying drones (UAVs)
  • two unmanned surface vehicles (USVs)
  • two underwater drones (UUVs)
  • nine types of sensors
  • many software systems for various tasks.

These are regularly used for field work and data collection. 

The drone and sensor capabilities needed in the field are often updated at project meetings using feedback collected from scientists working in SeaBee, who are responsible for validating the collected data. 

Read more: Two New EVO drones added to SeaBee family 

Field work in Finnmark (Ascomap project)

SeaBee workshops 

There are two physical workshops that house the SeaBee drones and sensors: at NIVA in Oslo and at NTNU in Trondheim. The NIVA workshop has been running since late 2022, and the workshop at NTNU since May 2021, equipped with a DJI drone carrying an AFX10 hyperspectral camera.

These locations were chosen for the physical workshops because they are the main nodes for the drone operations in SeaBee, and can reasonably easily cover the geographical needs of the partners. The workshops store drones and equipment, and are where the drone pilots carry out routine maintenance. 

The field of drone and sensor technology is advancing rapidly. Therefore, purchasing decisions should be carefully deliberated to ensure that you are acquiring cutting-edge technology, rather than investing in solutions that are either not fully developed or that will soon become obsolete. Also, it's equally crucial to ensure that the purchases align well with the researchers' needs. – Anders Gjørwad Hagen, NIVA

 

Next Steps 

Although the SeaBee drones and sensors are now in regular use, that does not mean we already have all the equipment that might be needed for new and different applications in the future. It is therefore important to select technology and equipment with multiple potential applications. Our latest purchases are a UAV bathymetric (green) LiDAR and a red LiDAR. The green LiDAR is capable of collecting bathymetry data down to about two Secchi depths, including in intertidal zones that cannot be reached by boat. This system can also be used for river and lake analyses, a potential new area for SeaBee activities.

As we approach the end of SeaBee’s establishment phase, it’s important that we make strategic purchases that provide adequate coverage for the (known and unknown) future needs during the operational phase. Therefore, discussions with all partners are planned to ensure good choices of technology are made.