Programme and videos

DAY 1: 14 February 2022

08:00-08:30: Registration on site

08:30-09:00: Welcome speeches

  • Dr. Vladimir Ryabinin – IOC Executive Secretary
  • Mr Mariusz Lewicki – Ministry of Foreign Affairs
  • Prof. Jan Marcin Węsławski – Director of IO PAN
  • Dr. Sergey Belov – IODE co-chair

Conference chair: Mr Taco de Bruin (IODE co-chair)

SESSION 1: GLOBAL STRATEGIES AND POLICY

1.1      The Global Ocean Data Ecosystem: status and way forward

09:00 – 09:09
[1] The GOOS Observations Coordination Group Data Implementation Strategy and Mapping Global Ocean Network Data Flows
Kevin O’Brien

[abstract]

The Global Ocean Observing System (GOOS) Observations Coordination Group (OCG) is responsible for coordinating activities among the global ocean observing networks, including established networks (Argo, OceanSITES, GO-SHIP, etc.) as well as emerging networks (OceanGliders, AniBOS, etc.). Part of that coordination includes work toward improving data flows from the networks to stakeholders and supporting the FAIR and CARE data principles for data discovery, dissemination and access.

The OCG recognizes the data policy and strategy development efforts currently underway, including the recently approved WMO Unified Data Policy and the upcoming revision of the IODE data strategy. In fact, the OCG has been and will continue to be involved in the development and revision of these data strategies. To respond properly to these new requirements, the OCG is developing a data implementation strategy for the global ocean observing networks. An important step in developing this implementation strategy is a complete mapping of the data flows through the OCG global networks. This effort is focused on near real-time data, delayed-mode data and metadata flows. Through this data mapping exercise, the OCG has been able to identify gaps in the data flows and also provide recommendations for greater efficiency. These recommendations form the basis of the OCG Data Implementation Strategy.

In this presentation, we will discuss the effort to map the data flows through the global networks, review the gaps and inefficiencies we discovered, and discuss how these findings shaped the OCG Data Implementation Strategy. We will also discuss how the data mapping effort can be extended to include IODE and WMO interests, and how the OCG efforts overlap with IODE (ODIS, ODISCat, OIH) and WMO (WIS and WIS 2.0).

09:09 – 09:18 [on-site]
[2] The European Marine Observation and Data Network (EMODnet): A regional best practice towards global data sharing and interoperability
Kate Larkin, Jan-Bart Calewaert, Conor Delaney, Joana Beja, Thierry Schmitt, Alessandra Giorgetti, Sissy Iona, Dick M.A. Schaap, Henry Vallius, Alessandro Pititto, Antonio Novellino, Patrick Gorringe, Mickaël Vasquez, Helen Lillis and Eleonora Manca

[abstract] The European Marine Observation and Data Network (EMODnet) is an EU marine data initiative providing open and free access to in situ marine data, data products and services related to the marine environment and human activities at sea. EMODnet uses common data and metadata standards to produce Findable, Accessible, Interoperable and Reusable (FAIR), integrated datasets and data products, with descriptive metadata, used by thousands of users across different sectors, for many purposes, multiple times. EMODnet’s ongoing centralisation of services will further simplify access to marine knowledge in the coming decade, and its collaboration with the Copernicus Marine Service will achieve a fully interoperable European marine data space. EMODnet also has an increasingly global outlook, with expanding data coverage, e.g. in ocean physics, bathymetry and biology, a growing user community, e.g. for European and global assessments, and data sharing beyond European seas, e.g. through the EMODnet PArtnership for China and Europe (EMOD-PACE), in collaboration with the National Marine Data and Information Service (NMDIS), China. EMODnet is a leading community for marine data services and provides technical expertise on data discovery and access to the International Oceanographic Data and Information Exchange (IODE) of the Intergovernmental Oceanographic Commission (IOC), providing a key European focal point for the developing IOC Ocean InfoHub (OIH), working with EurOcean and other actors. In the coming decade, EMODnet will increase its contribution to global ocean data, information and stewardship initiatives, including the IOC OIH and related working groups, the Global Earth Observation System of Systems (GEOSS) and GEO Blue Planet initiative, and the activities of the UN Decade of Ocean Science for Sustainable Development. In dialogue with global actors, EMODnet will also increase the impact of marine data for society, e.g. for the EU Green Deal, the World Ocean Assessment, the G7 and the UN 2030 Agenda.


09:18 – 09:27
[3] Ocean FAIR Data Services – Two years on
Toste Tanhua, Sylvie Pouliquen, Jessica Hausman, Kevin O’Brien, Pip Bricher, Taco de Bruin, Justin J.H. Buck, Eugene F. Burger, Thierry Carval, Kenneth S. Casey, Steve Diggs, Alessandra Giorgetti, Helen Glaves, Valerie Harscoat, Danie Kinkade, Jose H. Muelbert, Antonio Novellino, Benjamin Pfeil, Peter L. Pulsifer, Anton Van de Putte, Erin Robinson, Dick M.A. Schaap, Alexander Smirnov, Neville Smith, Derrick Snowden, Tobias Spears, Shelley Stall, Marten Tacoma, Peter Thijsse, Stein Tronstad, Thomas Vandenberghe, Micah Wengren, Lesley Wyborn, Zhiming Zhao

[abstract] Well-founded data management systems are of vital importance for ocean observing systems as they ensure that essential data are not only collected but also retained and made accessible for analysis and application by current and future users. Effective data management requires collaboration across activities including observations, metadata and data assembly, quality assurance and control (QA/QC), and data publication that enables local and interoperable discovery and access, and secure archiving that guarantees long-term preservation. To achieve this, data should be Findable, Accessible, Interoperable, and Reusable (FAIR). In recent decades, ocean data managers, in close collaboration with international organizations, have played an active role in the improvement of environmental data standardization, accessibility and interoperability through different projects, enhancing access to observation data at all stages of the data life cycle and fostering the development of integrated services targeted to research, regulatory and operational users.
As ocean observing systems evolve and an increasing number of autonomous platforms and sensors are deployed, the volume and variety of data increases dramatically. To better serve research, operational, and commercial users, more efficient turnaround of quality data in known formats and made available through web services is necessary. In particular, automation of data workflows will be critical to reduce friction throughout the data value chain. Adhering to the FAIR principles with free, timely and unrestricted access to ocean observation data is beneficial for the originators, has obvious benefits for users and is an essential foundation for the development of new services made possible with big data technologies. In this report we review progress since the OceanObs’19 meeting.

09:27 – 09:36
[4] Transform to OPen Science (TOPS)
Chelle Gentemann, Steve Crawford, Kevin Murphy, Katie Baynes, Yvonne Ivey, Kevin Ward, Emily Cassidy, Rahul Ramachandran, Manil Maskey, Yaitza Luna-Cruz, Chris Lynnes and Elena A. Steponaitis

[abstract] The Transform to OPen Science (TOPS) mission is designed to rapidly transform agencies, organizations, and communities to an inclusive culture of open science. TOPS activities will help the Ocean Decade achieve a more equitable and inclusive future digital ecosystem and advance ocean science. Open science — opening up the scientific process from idea inception to result — increases access to knowledge and expands opportunities for new voices to participate. Sharing the data, code, and knowledge associated with the scientific process lowers barriers to entry for historically-excluded communities and enables findings to be reproduced. But truly opening up the scientific process requires that we change the cultural norms that are preventing us from working together and moving forward. To help catalyze and support change within the community, NASA is championing a new initiative, the Open-Source Science Initiative (OSSI). A key component of OSSI is the Transform to OPen Science (TOPS) mission and NASA’s declaration of 2023 as the Year Of Open Science (YOOS). TOPS will be a global community initiative to spark change and inspire open science engagement through events and activities that will shift the current paradigm. Partnering with other organizations, NASA will support and enable the scientific community to move towards open science. Over the next five years, the TOPS mission will coordinate efforts designed to rapidly transform agencies, organizations, and communities. These efforts are aligned with recommendations from NASA, the National Academies, and the United Nations Educational, Scientific and Cultural Organization (UNESCO). To guide efforts, TOPS has three overarching goals:
1. Increase understanding and adoption of open science principles and techniques in the science community
2. Accelerate major scientific discoveries
3. Broaden participation by historically-excluded communities

09:36 – 09:45
[5] All-Atlantic Ocean Data Enterprise 2030 – Common Standards for Information and Data Sharing
Olga Sato, Nicolas Dittert, Ana Rei and Nikki Funke

[abstract] According to the IPCC, climate change continues to have a far-reaching impact on the oceans and their coasts, with profound consequences for marine ecosystems. At the same time, the oceans play a key role in the development of national and regional economies, in achieving the Sustainable Development Goals and in addressing climate change. Globally, the livelihoods of many coastal places and the people who live there are under threat. Policymakers should be able to make global decisions based on the best possible knowledge. An essential basis for this is the availability of scientific data: natural science figures as well as social science facts form the basis for a balanced picture of today’s knowledge about the current situation for future decisions. The agreement signed as the Belém Statement between the signatories Brazil, the European Commission, and South Africa creates the political framework for transatlantic cooperation. One goal of the Coordination & Support Action AANChOR (https://allatlanticocean.org/), which is financially supported by the European Commission, is the formation of an All-Atlantic Ocean Data Space. As a representative of its five core activities, we present the Joint Action “All-Atlantic Ocean Data Enterprise 2030 – Common Standards for Information and Data Sharing”. Behind this title lies the task: create a common All-Atlantic Ocean Data Space in which all interested countries of the South and North Atlantic jointly undertake the effort to collect, archive, and share historical and current data transparently and free of charge. This initiative is guided by the data management principles known as Findable, Accessible, Interoperable, Re-usable (FAIR) and Collective Benefit, Authority to Control, Responsibility, and Ethics (CARE).

09:45 – 09:54
[6] Unleashing the power of data with blockchain
Aldo Drago

[abstract] A lot has already been said about blockchain, yet it remains a big unknown even among established professionals. Its full range of influence is not yet foreseen, and it will trigger impacts in as-yet-unknown realms of application, revolutionizing the business world and especially the way we exchange data. Data and information in the digital age are spearheading the evolution of services and product development, serving a continuum of user demands at all levels and scales, and boosting research and innovation applications. Indeed, data is today recognised as a key ingredient for competitiveness, a trend set to accelerate in the immediate future. Blockchain comes to our aid here, providing a solution to meet the demands of emerging data-hungry users and promising to unleash the full power of data. The blockchain protocol is evolving to specifically power decentralised data marketplaces, which are equivalent to distributed point-to-point databases relying on a system of collective mutual verification and qualification. Through the use of data encryption, timestamps, distributed consensus, and economic incentives in the distributed system, blockchain can support data transactions without the need for formal mutual trust between parties, providing solutions for data flows at low cost, high efficiency and maximal security compared to traditional data storage and exchange mechanisms. Blockchain motivates what are being coined ‘data commons’: places where data is freely unlocked and pooled to be used and re-used by many, while carrying with it the value and wealth that grows with the data itself and evolves seamlessly with its usage, returning economic value to the original data providers, the data product developers and the service providers.
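
As an illustration of the mechanism described above, here is a minimal hash-chain sketch in Python (illustrative only, not any specific blockchain protocol): each record is timestamped, linked to the hash of its predecessor, and verifiable by anyone, so tampering with a past data transaction is detectable.

```python
import hashlib
import json
import time

def block_hash(body: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(body, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def add_record(chain: list, data: dict) -> None:
    """Append a timestamped data record linked to the previous block."""
    block = {
        "index": len(chain),
        "timestamp": time.time(),
        "data": data,                                      # e.g. an ocean observation
        "prev_hash": chain[-1]["hash"] if chain else "0" * 64,
    }
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain: list) -> bool:
    """Any tampering with a past record breaks the hash links."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain: list = []
add_record(chain, {"station": "A1", "sst_degC": 18.4})
add_record(chain, {"station": "A2", "sst_degC": 17.9})
assert verify(chain)
```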

09:54 – 10:03
[7] The Black Sea Reanalysis System in the Framework of the Copernicus Marine Service
Leonardo Lima, Stefania Angela Ciliberti, Ali Aydoğdu, Simona Masina, Romain Escudier, Andrea Cipollone, Diana Azevedo, Salvatore Causio, Elisaveta Peneva, Rita Lecci, Emanuela Clementi, Eric Jansen, Mehmet Ilicak, Sergio Cretì, Laura Stefanizzi, Francesco Palermo and Giovanni Coppini

[abstract] Ocean reanalyses are becoming increasingly important to reconstruct and provide an overview of the ocean state from the past to the present day. We present a Black Sea reanalysis providing the major hydrodynamic variables for the period from 1993 to May 2020. Within the scope of the Copernicus Marine Environment Monitoring Service (https://marine.copernicus.eu/), the Black Sea reanalysis system is produced using an advanced variational data assimilation method to combine the best available observations with a state-of-the-art ocean general circulation model. The hydrodynamical model is based on the Nucleus for European Modelling of the Ocean (NEMO), implemented in the Black Sea domain with a horizontal resolution of 1/27° x 1/36° and 31 unevenly distributed vertical levels. The model is forced by the ECMWF ERA5 atmospheric reanalysis and monthly climatological precipitation, whereas the sea surface temperature is relaxed to satellite daily objective analysis fields. The model is online coupled to OceanVar, a 3D-Var ocean data assimilation scheme, to assimilate sea level anomaly along-track observations and in situ vertical profiles of temperature and salinity. The reanalysis shows that the overall surface dynamic topography is well reproduced and that the main features of the Black Sea circulation, such as the Rim Current and the quasi-permanent anticyclonic Sevastopol and Batumi eddies, are represented. The system produces very accurate estimates of temperature, salinity, and sea level, which makes it suitable for understanding the Black Sea physical state over recent decades. Temperature trends indicate the recent faster warming of the Black Sea, which has impacted the formation of its cold intermediate layer. This is an important signal of the Black Sea response to climate change. Nevertheless, in order to improve the system quality, new developments in ocean modeling and data assimilation remain important, and sustaining the Black Sea ocean observing system is crucial.
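
For context, a 3D-Var scheme such as OceanVar seeks the analysis state that minimizes the standard variational cost function, given here in its generic textbook form (the abstract does not spell out OceanVar’s exact formulation):

```latex
J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\top}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
              + \tfrac{1}{2}\,\bigl(\mathbf{y}-H(\mathbf{x})\bigr)^{\top}\mathbf{R}^{-1}\bigl(\mathbf{y}-H(\mathbf{x})\bigr)
```

where x_b is the model background state, B and R are the background- and observation-error covariances, y is the observation vector (here, along-track sea level anomalies and in situ temperature and salinity profiles), and H maps model space to observation space.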

10:03 – 10:09 Q&A
10:09 – 10:19 Health break

1.2       Identifying data and information user needs at the national level

10:19 – 10:28
[8] Coastal-Marine Research and Environmental Management: MSP for Coastal Regions Sustainable Development in Ukraine
Dmytro Cheberkus, Yurii Tuchkovenko and Sofiia Zherebchuk

[abstract] Application of the ecosystem-based approach presupposes a holistic perspective, continual development of knowledge of the seas and their usage, application of the precautionary principle, and flexible management.
Marine Spatial Planning (MSP) provides an important tool for sustainably managing mounting pressures on the living and non-living resources of marine areas and has proven to be an effective instrument for environmental management and sustainability. Community-based initiatives occur more commonly in smaller coastal regions, where the community of users lives in close proximity to the sea and consequently experiences direct impacts and benefits. On the other hand, centralized or top-down management can also be effective, especially in achieving broader biodiversity conservation objectives through the establishment of MSP. In many countries, consultation is required by law or as a matter of policy, with ultimate decision-making power and funding decisions remaining with the government. Both top-down and bottom-up approaches have been criticized for failures to meet conservation objectives and to sustain engagement of stakeholders over time. For Ukraine, the bottom-up approach is likely to be more fruitful. Addressing environmental issues at the local level may be more efficient than at the central level because local communities are more aware of their problem priorities and see a specific benefit in solving one or another environmental issue. The first step towards embedding MSP at the local level is to develop and apply a methodology for an interdisciplinary system of indicators for assessing environmental resilience in coastal zones, and for identifying priorities in the sustainable management of natural resources and of the whole socio-ecological system of the coastal zone, including the coastal-marine interface. Implementation of such a system at the level of territorial administrative units will contribute to a “bottom-up initiative” to create an efficient system for integrated socio-ecological and sustainable environmental management in the Ukrainian sector of the Black Sea coastal zone, ensuring sustainable development.

10:28 – 10:37
[9] EMODnet Biology: A European initiative with global influence
Joana Beja, Leen Vandepitte, Dimitra Mavraki, Vasilis Gerovasileiou, Tom Webb, Dan Lear, Bart Vanhoorne, Lennert Tyberghein and EMODnet Biology

[abstract] EMODnet Biology has recently completed a decade of efforts towards making marine biodiversity data openly available following the FAIR principles. The increase in available data and development of data products has been supported by extending the partnership to include not only national organisations that act as advisers to the government on maritime issues, but also academic institutions and Small and Medium Enterprises (SMEs), and by collaboration with other EMODnet thematic lots.
The rapid establishment of EMODnet Biology was facilitated by the existence of EurOBIS, the European node for the Ocean Biodiversity Information System (OBIS). Due to EurOBIS acting as the core data infrastructure, there is a streamlined connection in terms of data sharing with other international biodiversity initiatives, like OBIS and GBIF (Global Biodiversity Information Facility).
The use of internationally recognised vocabularies, like WoRMS (World Register of Marine Species, a UN Decade endorsed project), Marine Regions for geographical data, or the BODC NVS2 (British Oceanographic Data Centre / NERC Vocabulary Server) for the extended measurements or facts, and of standards (Darwin Core), allows for improved interoperability with other (types of) data.
A future implementation of fitness for use criteria will allow users to select the best data for their purposes (e.g. presence/absence, quantitative analysis or modelling), which can then be downloaded via the portal or OGC web services.
The data products are created with our stakeholders in mind and target not only EBV/EOVs but also specific MSFD indicators. The next few months will see all EMODnet thematic lots merge into the Central Portal, but these changes won’t affect the work we have done and will continue to do. EMODnet Biology, with more than 25 million occurrence records, is one of the main OBIS data providers, and our data contributes to policies and (inter)national marine biodiversity assessments.

10:37 – 10:46
[10] Creating a sustained climate ocean observing system for societal benefit and risk awareness: the AMOC-ASAP approach
Johannes Karstensen, C. Bearzotti, J.-B. Calewaert, I. Castelão, S. Heymans, R. Higgings, K. Larkin, G.D. McCarthy, A. Novellino, A. Oliveira, S.M. Olsen, A.M. Piniella and K. von Schuckmann

[abstract] The benefit of systematic and sustained climate observations is evident in light of the rapidly increasing direct impact of climate change on our lives. The ocean observing value chains on climate issues (e.g. heat transport, tipping points, ecosystem changes) are composed of multiple elements, such as nationally and internationally funded ocean observations, data management and data integration, and specific means for the creation and dissemination of products. AMOC-ASAP is an international effort that will analyze the ocean observing value chains related to the Atlantic Meridional Overturning Circulation (AMOC), or Gulf Stream System, contributing to the following aspects: establishing actionable AMOC indicators, sustaining AMOC observing value chain elements, applying AMOC-associated knowledge, and predicting future AMOC evolution (ASAP). Impacts associated with fluctuations in the AMOC are of global importance and may affect almost all of us – extreme weather, large-scale persistent and also short-term (heat wave) temperature changes that can influence crop yields, fisheries, ecosystems’ health, as well as private sector business and more. In this regard the AMOC is linked to several of the 17 Sustainable Development Goals (SDGs). We will outline how AMOC-ASAP will operationally connect and make use of existing data infrastructures, such as the EU marine data services EMODnet and CMEMS, and of the IODE of UNESCO, to evolve the current AMOC data pipelines and move towards fully transparent and FAIR marine data and data products related to the AMOC. This includes the routine production of indicators of Atlantic Ocean circulation for policy makers, businesses and society to assess ocean health in the context of the climate emergency. The ultimate outcome of AMOC-ASAP is a sustained AMOC data ecosystem that delivers authoritative data and serves various known and anticipated users, while also providing inputs from which to retrieve added-value insights on the AMOC’s role in the Earth System.

10:46 – 10:55
[11] Value chains in public marine data – a UK case study
Clare Postlethwaite, Claire Jolly, Emma Heslop and James Jolliffe


[abstract] Value chains are a useful concept for mapping the relationship between data production, data processing, generation of data products and use by different groups. This presentation summarises the results of a joint project between the Organisation for Economic Co-operation and Development (OECD), the Global Ocean Observing System (GOOS) and the UK’s Marine Environmental Data and Information Network (MEDIN). The project provided a better understanding of the pathways through which marine data are used and transformed into actionable information, creating systematised value chains for the first time. MEDIN is the hub for UK marine data, and promotes sharing, re-use and improved access to marine data. Within the MEDIN framework, data are managed and delivered by a network of specialist data archive centres, covering a range of marine data types. Information was collected about the users and uses of marine data from these public data repositories via a specially developed online survey. The survey identified the different sectors of the UK economy benefiting from data made available in this way and found a diverse user base, with users from multiple industries of the UK’s ocean economy. The links between different types of marine data and different sectors of the economy are multiple and varied, spawning many complex value chains as a result. The project concluded with a series of recommendations for the various actors involved in marine data value chains: funders, the marine observing community and data centres. These include promoting the use and reuse of marine data in policies and supporting the entire marine data value chain; developing communication strategies that clearly outline how collected and archived marine data are used and reused throughout the ocean economy; and considering the technical needs of data users. The recommendations from this study will be applicable to public marine data repositories anywhere.

1.3       Global data sharing: changes in data sharing policies

10:55 – 11:04
[12] Governance and Business Models for Data Sharing
Tom Redd, Gry Ulverud, Martin S Moen, Inge Sandvik, Anna Silyakova, Mogens Mathiesen

[abstract] Ocean data is an important asset for ocean health and ocean wealth. To harmonize sustainable ocean use and environmental protection we need ocean data, collected by various entities, shared openly worldwide. While the principle that taxpayer-funded data should be publicly available has driven progress in opening access to science data, this principle does not apply to industry data. However, many businesses now realise the need to transition towards environmentally sustainable models and understand the role of data sharing in enabling this. When speaking to businesses about the barriers to data sharing, several themes are often identified, including ownership, fairness of use, trust, ethics, monetization, and risk to competitiveness. At the same time, there is a general willingness to share data, especially for use in science, if these issues are resolved. As identified by the High-Level Panel for a Sustainable Ocean Economy, new market incentives for innovation, new public–private instruments for investment and new business models are needed. In this session, we would like to initiate a discussion of different models for data sharing, especially in relation to using industry data for science. Recognising that there may be different models for science, industry and government institutions is the first step in understanding how data providers can benefit from data sharing, whether that be acknowledgment for those who have gathered data, potential new revenue sources, or demonstrating a contribution towards sustainable development.

11:04 – 11:13
[13] Best Practice Data Standards for Discrete Chemical Oceanographic Observations
Li-Qing Jiang, Denis Pierrot, Rik Wanninkhof, Richard A. Feely, Bronte Tilbrook, Simone Alin, Leticia Barbero, Robert H. Burne, Brendan R. Carter, Andrew G. Dickson, Jean-Pierre Gattuso, Dana Greeley, Mario Hoppema, Matthew P. Humphreys, Johannes Karstensen, Nico Lange, Siv K. Lauvset, Ernie R. Lewis, Are Olsen, Fiz F. Perez, Christopher Sabine, Jonathan D. Sharp, Toste Tanhua, Thomas W. Trull, Anton Velo, Andrew J. Allegra, Paul Barker, Eugene Burger, Wei-Jun Cai, Chen-Tung A. Chen, Jessica Cross, Hernan Garcia, Jose Martin Hernandez-Ayon, Xinping Hu, Alex Kozyr, Chris Langdon, Kitack Lee, Joe Salisbury, Zhaohui Aleck Wang and Liang Xue

[abstract] Effective data management plays a key role in oceanographic research as cruise-based data, collected from different laboratories and expeditions, are commonly compiled to investigate regional to global oceanographic processes. Here we describe new and updated best practice data standards for discrete chemical oceanographic observations, specifically those dealing with column header abbreviations, quality control flags, missing value indicators, and standardized calculation of certain properties. These data standards have been developed with the goals of improving the current practices of the scientific community and promoting their international usage. These guidelines are intended to standardize data files for data sharing and submission into permanent archives. They will facilitate future quality control and synthesis efforts and lead to better data interpretation. In turn, this will promote research in ocean biogeochemistry, such as studies of carbon cycling and ocean acidification, on regional to global scales. These best practice standards are not mandatory. Agencies, institutes, universities, or research vessels can continue using different data standards if it is important for them to maintain historical consistency. However, it is hoped that they will be adopted as widely as possible to facilitate consistency and to achieve the goals stated above.
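
To make the kinds of conventions discussed concrete, below is a hypothetical bottle-data excerpt with short column abbreviations, WOCE-style quality flags and a numeric missing-value indicator; the exact abbreviations and flag values recommended by the authors may differ from this sketch.

```python
# Hypothetical discrete-bottle data excerpt: short column abbreviations,
# a paired *_FLAG_W column per variable holding WOCE-style flags
# (2 = acceptable, 3 = questionable, 4 = bad, 9 = not sampled),
# and -999 as the missing-value indicator.
import io
import pandas as pd

csv = """EXPOCODE,STNNBR,CTDPRS,OXYGEN,OXYGEN_FLAG_W,PH_TOT,PH_TOT_FLAG_W
33RO20220101,1,5.1,245.3,2,8.05,2
33RO20220101,1,100.4,198.7,3,-999,9
"""

# Interpret -999 as missing on read, then keep only acceptable oxygen values.
df = pd.read_csv(io.StringIO(csv), na_values=[-999])
good_oxygen = df[df["OXYGEN_FLAG_W"] == 2]
print(good_oxygen[["CTDPRS", "OXYGEN"]])
```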

11:13 – 11:19 Q&A
11:19 – 11:29 Health break

1.4       The future of global databases: what’s next for WOD, OBIS,…

11:29-11:38
[14] Above and Beyond – Completing the World Register of Marine Species (WoRMS)
Leen Vandepitte, Wim Decock, Stefanie Dekeyzer, Bart Vanhoorne, Andreas Kroh and Tammy Horton

[abstract] The general vision of the Ocean Decade implies we have excellent knowledge of what lives in the ocean. All marine biological science is somehow linked to the core knowledge and correct usage of taxonomy, a scientific means to unambiguously name all marine species, ensuring all existing names are thoroughly documented and linked to their valid name. The aim of WoRMS is to provide an authoritative classification and catalogue of marine names, supported by the published literature, and edited by expert taxonomists globally. In October 2021, WoRMS was endorsed as an Ocean Decade Project Action, entitled ‘Above and Beyond – Completing the World Register of Marine Species (ABC WoRMS)’, linked to the Ocean Decade Programme ‘Marine Life 2030’. An authoritative and comprehensive list of names of marine organisms may seem a small, implicit part of the Decade mission, yet it is indispensable and fundamental in linking numerous other initiatives. WoRMS is continuously updated to reflect new research results, yet taxonomic gaps still need to be addressed. During the full span of the Ocean Decade, WoRMS will continue its endeavors to provide a full taxonomic overview of all marine life, thereby supporting not only scientists but also everyone who makes use of species names, including policy, industry and the public at large. New challenges in the field of taxonomy, such as temporary names using Open Nomenclature, will be explored to ensure the most suitable solution for the whole WoRMS community is implemented. The documentation of species traits – which are of critical importance for ecological marine research – will be further encouraged, and we will strengthen and improve the functionality of existing links with other global databases, infrastructures and initiatives such as LifeWatch, the LifeWatch Species Information Backbone, OBIS, GOOS, COL, DiSSCo, BOLD and GenBank.
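
WoRMS also exposes its catalogue programmatically through a public REST service at marinespecies.org/rest; a minimal name-matching sketch follows (endpoint, parameters and field names as currently documented there, but verify against the live service):

```python
import requests

def match_worms(name: str):
    """Look up a scientific name in WoRMS; return the first Aphia record or None."""
    url = f"https://www.marinespecies.org/rest/AphiaRecordsByName/{name}"
    resp = requests.get(url, params={"like": "false", "marine_only": "true"})
    if resp.status_code != 200:   # WoRMS answers 204 No Content when nothing matches
        return None
    records = resp.json()
    return records[0] if records else None

rec = match_worms("Mola mola")
if rec:
    # AphiaID is the stable identifier; valid_name resolves synonyms.
    print(rec.get("AphiaID"), rec.get("scientificname"),
          rec.get("status"), rec.get("valid_name"))
```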

11:38 – 11:47
[15] AquaDocs: Ensuring the FAIRness of ocean and aquatic research for the Ocean Decade and beyond
Sally Taylor, Ekaterina Kulakova, Lisa Raymond, Pauline Simpson, Tamsin Vicary and Jennifer Walton

[abstract] AquaDocs is the new open access thematic repository of the UNESCO/IOC International Oceanographic Data and Information Exchange (IODE) and the International Association of Aquatic and Marine Science Libraries and Information Centers (IAMSLIC), with support from the FAO Aquatic Sciences and Fisheries Abstracts Secretariat. It covers natural marine, coastal, estuarine/brackish and freshwater environments. It was built by merging the IODE OceanDocs and the IAMSLIC Aquatic Commons and now holds >38,000 records. AquaDocs is provided as a scalable service to those ocean and aquatic organizations, projects and individuals who do not have a repository in which to make their research output permanently stored and publicly discoverable and accessible, in line with FAIR principles. At present more than 130 organizations worldwide deposit into AquaDocs, and it is now becoming recognized as a sustainable, curated repository for documents (and other media) by international organizations and projects such as SCOR (Scientific Committee on Oceanic Research), POGO (Partnership for Observation of the Global Ocean), IIOE-2 (Second International Indian Ocean Expedition 2015-2020), and ICAN (International Coastal Atlas Network). This presentation will outline a proposal that AquaDocs participate as a Decade Project in the Ocean Decade as the host repository for Decade Communities of Practice to use as their permanent record of Ocean Decade contributions. AquaDocs could also be the archive for the official Ocean Decade documentation. Participants are invited to register on AquaDocs https://aquadocs.org/ and test its functionality, with the aim of co-designing enhancements with the AquaDocs Team to ensure AquaDocs is a secure, permanent, fit-for-purpose repository for the Ocean Decade.

11:47 – 11:56
[16] The OBIS we need for the Ocean we want
Ward Appeltans, Nathalie Van Isacker, Nathalie Tonné, Pat Halpin, Sky Bristol, Eduardo Klein, Martha Vides and Anton Van de Putte

[abstract] For 21 years now, we have been building a central global data platform that provides free access to the world’s ocean biodiversity and biogeographic data. The OBIS system grew by 20 million records last year and will grow even more rapidly in the coming years as new innovative observing technologies, such as environmental DNA and automated imaging devices, are put in place. OBIS is one of the main building blocks for the development of an integrated Ocean Observing System aimed at developing the key indicators to report on the health status of our ocean and its natural resources. OBIS has invested heavily in rebuilding its infrastructure and has been able to promote OBIS successfully, as illustrated by the growth of new incoming data. However, with the rise in expectations related to the UN Ocean Decade and initiatives under the Global Ocean Observing System (GOOS), the Marine Biodiversity Observation Network (MBON) and others, we needed to make sure OBIS and its network can meet those growing needs. We will report on the results of a recent stakeholder and capacity assessment conducted to better understand our partners’ expectations and hopes for OBIS, to revisit a number of those relationships, and to discuss the way forward to further shape the OBIS we need by 2030.
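
For readers who want to try the platform, OBIS occurrence records can be pulled from its public API; the sketch below assumes the v3 endpoint and parameter names documented at api.obis.org, so verify them against the current docs:

```python
import requests

# Fetch one page of OBIS occurrence records for a single species.
resp = requests.get(
    "https://api.obis.org/v3/occurrence",
    params={"scientificname": "Mola mola", "size": 100},
)
resp.raise_for_status()
results = resp.json()["results"]

# Records use Darwin Core style field names.
for occ in results[:5]:
    print(occ.get("scientificName"),
          occ.get("decimalLongitude"),
          occ.get("decimalLatitude"),
          occ.get("eventDate"))
```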

11:56 – 12:05
[17] Development of standardized data management for a marine invasive species monitoring plan using environmental DNA sampling in Suva, Fiji
Saara Suominen, Joape Ginigini, Gilianne Brodie, Paayal Kumar, Pieter Provoost, Matthias Obst, Craig Sherman, Neil Davies, Christopher Meyer, Pier Luigi Buttigieg, Pascal Hablutzel, Nic Bax, Frank Muller-Karger, John Deck, Ward Appeltans

[abstract] Due to the ease of sampling, environmental DNA (eDNA) is rapidly becoming a popular method for invasive species monitoring. However, several issues remain that hinder its uptake in routine environmental monitoring, especially in developing nations. Initial laboratory setup can be costly, and specific expertise is needed for sample processing. Data from eDNA analyses require infrastructure and considerable expertise to produce reliable and reproducible results. The Pacific Islands Marine Bioinvasions Alert Network (PacMAN) is a 3-year project with the aim of developing an eDNA-based early detection and rapid response system for marine invasive species in Suva Harbour, Fiji. A major focus of PacMAN will be on developing standard protocols for sampling, sample processing and data management workflows that can be adopted in the Pacific Islands region at large. The data workflow will ensure standardized and reproducible data management from field sampling to submitting new datasets to the Ocean Biodiversity Information System (OBIS). A web interface will be developed for data submission, while the remaining work will run through an automated pipeline. The bioinformatics pipeline will take raw sequences and related metadata, and package the taxonomically annotated occurrence records into Darwin Core archives ready to be submitted to the database. This information will then be used to inform the PacMAN decision support tool, which will provide species risk estimates based on habitat suitability modelling and the known invasive properties of the detected species. All data products, from the bioinformatics pipeline to the occurrence datasets, will be made publicly available through, e.g., GitHub (code and scripts), NCBI (raw sequences), BOLD (specimen biomarker sequences) and OBIS (processed data). The PacMAN project has a strong focus on building long-term sustainability beyond the duration of the project by (i) engaging stakeholders from the beginning of the project because they will need to consume and act on the data products, (ii) organizing training courses to increase local science capacity for eDNA sampling and sample processing, as well as data interpretation by the national authorities in Fiji, and (iii) using stable, global data infrastructures that will persist long after the project ends. PacMAN plays an important role in enhancing biosecurity and conserving biodiversity by piloting new technologies and developing best practices for monitoring biodiversity using novel molecular tools and state-of-the-art data management.
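
The final packaging step described above — turning annotated detections into Darwin Core occurrence records for OBIS — might look roughly as follows (the field mapping is a hypothetical sketch, not the actual PacMAN pipeline):

```python
import uuid
import pandas as pd

# Hypothetical taxonomically annotated detections from the bioinformatics pipeline.
detections = [
    {"taxon": "Mytilus galloprovincialis", "lat": -18.13, "lon": 178.42,
     "date": "2022-02-14", "reads": 1543},
    {"taxon": "Ciona intestinalis", "lat": -18.13, "lon": 178.42,
     "date": "2022-02-14", "reads": 87},
]

# Map to (a subset of) Darwin Core occurrence terms for submission.
dwc = pd.DataFrame({
    "occurrenceID": [str(uuid.uuid4()) for _ in detections],
    "scientificName": [d["taxon"] for d in detections],
    "decimalLatitude": [d["lat"] for d in detections],
    "decimalLongitude": [d["lon"] for d in detections],
    "eventDate": [d["date"] for d in detections],
    "basisOfRecord": "MaterialSample",          # eDNA detections are sample-derived
    "organismQuantity": [d["reads"] for d in detections],
    "organismQuantityType": "DNA sequence reads",
})

# Tab-separated core file, as used inside a Darwin Core archive.
dwc.to_csv("occurrence.txt", sep="\t", index=False)
```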

12:05 – 12:14
[18] The World Ocean Database (WOD) Cloud: an international ocean and coastal quality-controlled open data discovery, access, and data ingestion tool
Hernan Garcia, Tim Boyer and WOD Team

[abstract] The World Ocean Database (WOD) is the largest formatted, quality-controlled, global ocean profile database. WOD includes over 17 million casts of historical and modern physical and chemical profiles with measured GOOS Essential Ocean Variables and plankton. WOD is an international data hub. It is a Center for Marine Meteorology and Oceanographic Climate Data (CMOC) in the Marine Climate Data System (MCDS), a joint system of the Intergovernmental Oceanographic Commission (IOC) and the World Meteorological Organization (WMO). WOD aims to include all possible open access oceanographic profile data from around the world, including data from the IODE network of National Oceanographic Data Centers and the World Data Service for Oceanography. The objective is to make these data easily and quickly discoverable and accessible without restrictions to users worldwide. An important step is to advance the WOD in the Cloud (WODc). WODc would enable greater operational data functionality for data users and providers worldwide, and would enable the development of add-on software applications from users around the globe to retrieve, visualize, and/or upload their data directly to the WODc in standardized forms. This would facilitate equitable discovery of and access to analysis-ready oceanographic data for the UN Decade of Ocean Science and multiple other uses. In cooperation with IODE, we plan to develop a data ingestion tool to operationally facilitate seamless aggregation of data from multiple digital formats into WODc. The WODc data ingestion tool will feed the WOD main database, where more rigorous data integrity checks would be performed in delayed mode (e.g., quality control, data duplication, units, metadata, attribution, archival, stewardship, DOI). These data would then also be made openly available via the WOD and WODc. We describe our plans to evolve WODc as an international ocean and coastal data hub resource for and by the international data user and provider communities.
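
One of the delayed-mode integrity checks mentioned — duplicate detection — can be sketched as a simple key-based pass over incoming casts (purely illustrative; WOD’s operational duplicate checking is considerably more sophisticated):

```python
from datetime import datetime

def cast_key(cast: dict) -> tuple:
    """Crude duplicate key: platform, rounded position, and observation hour."""
    t = datetime.fromisoformat(cast["time"])
    return (cast["platform"],
            round(cast["lat"], 2), round(cast["lon"], 2),
            t.strftime("%Y-%m-%d %H"))

def deduplicate(casts: list) -> list:
    """Keep the first cast seen for each key; drop likely duplicates."""
    seen, unique = set(), []
    for cast in casts:
        key = cast_key(cast)
        if key not in seen:
            seen.add(key)
            unique.append(cast)
    return unique

casts = [
    {"platform": "SHIP01", "lat": 54.52, "lon": 18.55, "time": "2020-05-01T06:10"},
    {"platform": "SHIP01", "lat": 54.52, "lon": 18.55, "time": "2020-05-01T06:40"},
]
print(len(deduplicate(casts)))  # -> 1 (same platform, position and hour)
```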

12:14-12:20: Q&A
12:20-13:21: Lunch

1.5       Representation and inclusiveness in the global commons

1.5.1    The small island dilemma: collecting, managing, sharing and using data with minimum resources

13:21-13:30
[19] Autonomous and accessible vessel monitoring for small-scale fisheries
Samantha Cope, Brendan Tougher and Virgil Zetterlind

[abstract] Monitoring vessel activity in remote locations is a challenge for fisheries worldwide, and likely more difficult for small-scale fisheries. While technologies used by industrial fleets may not be available or applicable, the minimal resources available for patrolling marine areas and evaluating collected data also present obstacles to local management. There is a need for monitoring solutions for small-scale fisheries at an appropriate scale, without compromising the benefits available from existing and new technology. The primary objective of the Marine Monitor (M2) is to provide an accessible vessel monitoring platform for resource-constrained managers. M2 uses marine radar, a familiar technology for vessel operators, to record and report on activity in nearshore areas of concern. Monitoring in remote areas is accomplished by outfitting M2 systems on towable trailers with a solar array, battery system, and microwave antenna. Data collected in the field are transferred to the cloud and synthesized in an automated process to create reports of daily, weekly, and monthly vessel activity, sent to users verified by local partners. By automating both data collection and evaluation, managers receive data products they can integrate with expert knowledge of the area to support decisions related to the use of local fisheries resources. With over 20 systems deployed worldwide, including in remote areas such as Ngarchelong, Palau; Turneffe Atoll, Belize; and others, M2 is a proven technology demonstrating how computing power available in the cloud can be applied to data collected locally. Data hosting on the cloud also facilitates the exchange of information across multiple M2 sites and organizations. In this way, M2 integrates data and lessons learned about vessel activity from unique locations into the larger digital ecosystem, while also building local capacity for effective management with minimal resources on the ground.
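
The automated reporting step could, in outline, be a daily aggregation over recorded radar tracks, as in this hypothetical sketch (column names and rules are invented, not the actual M2 code):

```python
import pandas as pd

# Hypothetical radar track log: one row per detected vessel track.
tracks = pd.DataFrame({
    "track_id": [101, 102, 103, 104],
    "start_time": pd.to_datetime([
        "2022-02-12 03:15", "2022-02-12 22:40",
        "2022-02-13 05:05", "2022-02-13 05:50",
    ]),
    "in_restricted_zone": [True, False, True, True],
})

# Daily summary for the emailed report: total tracks and zone incursions.
daily = (tracks
         .assign(date=tracks["start_time"].dt.date)
         .groupby("date")
         .agg(vessel_tracks=("track_id", "count"),
              zone_incursions=("in_restricted_zone", "sum")))
print(daily)
```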

13:30 – 13:39
[20] Democratising data for sustainable fisheries in the coastal tropics
Stephen Rocliffe, Katie Stone

[abstract] Despite being one of the world’s most important food production systems, small-scale fishing remains one of the most data-poor sectors of the global economy. Gathering data in remote coastal communities can be time consuming and expensive. Since many local fishers lack the specific expertise to analyse data, the job typically falls to researchers or consultants. All too often, the results are not fed back to the communities themselves, methods are not standardised and data is walled off in siloed databases, greatly reducing its value. Although technological improvements are increasing data availability and quality in many countries, small-scale fishers are still far from having the information they need to make informed decisions about their resources. This information poverty poses risks that go well beyond weaker management and lower earnings, and exacerbates their disenfranchisement. In this session, we explore lessons learned from Blue Ventures’ (BV’s) work addressing data paucity in small-scale fisheries. For more than a decade, BV has trained local fishers in the coastal tropics to monitor catches with smartphones using a scientifically robust method. By using online platforms, the organisation has pooled data across multiple communities at different scales. The resulting datasets are open access and centralised to minimise costs and maximise utility. Working in this way means data can be interpreted, summarised and fed back to communities as rapidly as possible. This enables course corrections and improvements and ultimately transforms power dynamics. It can embolden small-scale fishing communities to engage more fairly with seafood markets, and to advocate for fisheries legislation that secures local rights. Connected, participatory data is one of the keys to delivering sustainable small-scale fisheries at scale.

1.5.2    Indigenous knowledge, information and data

13:39 – 13:48
[21] 4 Oceans: Transgressing Time, Space, Indigeneity and Science
John Nicholls, Laoise Dillon

[abstract] The 4 Oceans ERC project is engaged in historical, archaeological, scientific and theoretical aspects of research that seek to develop our understanding of human exploitation of our oceans through time. We are gathering data on at least ten species of marine animals and assessing the changing impacts of human interactions with these creatures. We deploy a raft of methodologies to address three fundamental questions:
1. When and where was marine exploitation of major significance to human society?
2. How did socio-economic, cultural, and environmental forces constrain and enable marine exploitation?
3. What were the consequences of marine exploitation for societal development and the oceans?
Crucial to the investigations being conducted is the generation of several data series that transgress spatiotemporal boundaries and contest some existing preconceptions (for example, the validity and veracity of Indigenous Knowledge – IK). These series will provide quantitative, scientific data points that may enable insights into the shifting awareness, changing patterns and evolutionary deviations of humanity’s impact on the oceans over time, thereby informing the project’s primary lines of inquiry. To emphasise the process of deriving scientific data from raw historical materials, unproven qualitative statements and reported but unverified data, we will present a case study that traces the path through the discovery, multidisciplinary partnership fostering, transcription, recording, verification, formulation and publication processes. This case study will, by example, demonstrate that we engage with Data Science and, through data management, gain scientific insights from archival materials by transformation, qualification and quantification. We posit that the digitisation and provision of open-access historical data may enhance scientific understanding of past and present marine exploitation and facilitate greater insight into the current state of the oceans.

1.5.3    Citizen science data and information

13:48 – 13:57
[22] Spotter Pro – powering regionally and organizationally targeted opportunistic and effort-based citizen science on mobile devices
Virgil Zetterlind, Deanna Richburg

[abstract] Given limited resources and the constant need for more in-situ observation of animal presence, citizen science applications offer researchers and resource managers the potential to drastically increase their available information. To be successful, citizen science applications face the difficult challenge of being accessible, compelling, and useful to the public while also achieving the data-gathering objectives of their sponsors. Over the last ten years, Conserve.IO has developed and maintained a family of citizen science apps based on a common core called Spotter Pro. The apps include Whale Alert, Sharktivity, and Ocean Alert for iOS and Android phones and tablets. These apps provide easy-to-use sighting surveys that are location aware – tailoring species lists, response actions for injured or entangled animals, and overlays based on the user’s location. They also provide users with information on best practices and rules for animal approach, active management zones (speed or other restrictions for whales), notification of closed beaches (sharks), and general information on nearby MPAs or other protected areas. To increase engagement, recent sightings are displayed in an interactive map – not only from other app users, but also from acoustic detections, including buoys and gliders, as well as formal effort surveys conducted by research organizations, offshore energy protected species observers, and governments. A common backend routes sightings by species, location, and animal status to resource managers and researchers in near real time via email and machine-to-machine APIs. For effort-based surveys, organizations can create custom workflows to replace paper surveys — saving significant time by automatically capturing the time and location of observations. Case studies from multi-year work in the Channel Islands National Marine Sanctuary, Glacier Bay National Park, and Stellwagen Bank will be presented.
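
The sighting-routing backend described might, in outline, look like the following sketch (species, statuses and endpoints are invented for illustration; the actual Spotter Pro backend is not described in this abstract):

```python
# Hypothetical routing table: (species, animal status) -> subscriber endpoints.
ROUTES = {
    ("North Atlantic right whale", "entangled"): ["noaa-hotline@example.org"],
    ("North Atlantic right whale", "healthy"):   ["sanctuary-team@example.org"],
    ("white shark", "any"):                      ["beach-safety@example.org"],
}

def route_sighting(sighting: dict) -> list:
    """Return recipients for a sighting, falling back to a status of 'any'."""
    key = (sighting["species"], sighting["status"])
    return ROUTES.get(key) or ROUTES.get((sighting["species"], "any"), [])

recipients = route_sighting(
    {"species": "white shark", "status": "swimming", "lat": 42.0, "lon": -70.3}
)
print(recipients)  # ['beach-safety@example.org']
```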

13:57 – 14:06
[23] Litter Intelligence: Quality Citizen Science Inspiring Litter Action
Camden Howitt

[abstract] Marine litter and ocean plastics present major risks to people, culture, environment and economies around the world. Globally, we lack reliable and standardised data to inform better decision-making, and though concern about these issues is relatively widespread, communities often lack the capability, opportunity and motivation to engage in solutions. At the 2017 UN Ocean Conference, New Zealand-based NGO Sustainable Coastlines launched an ambitious voluntary commitment to create a system for Pacific-wide community-led solutions to plastic pollution. In 2018, with funding from NZ’s Ministry for the Environment, and in collaboration with Stats NZ and the Department of Conservation, the NGO developed the country’s first national litter monitoring programme, enabling communities to collect robust scientific data, gain insights and take action to prevent litter. At the programme’s core is a world-class database and technology platform, providing instant and open access to marine litter data, data visualisations, and analysis tools. Microsoft provided additional funding to build image-recognition tools that further enhance data quality. Sustainable Coastlines currently trains and supports citizen scientists to monitor litter at 250 coastal sites around New Zealand and 6 pilot sites around the Pacific. To provide more holistic data, the NGO recently integrated freshwater and stormwater litter data into the platform, with terrestrial litter data integration currently in progress.
The programme adheres to strict open data and transparency principles. Litter Intelligence has already achieved major milestones, allowing citizen scientists for the first time to contribute to official ‘Tier 1’ national environmental reporting, informing major changes in plastics and waste policy, and becoming one of only five programmes globally that enable communities to monitor official progress against the SDGs. The programme and technology were trialled in Samoa for SPREP’s 2019 Pacific Environment Forum, and in September 2021 groups across Samoa, Tonga and Wallis & Futuna piloted the programme.

14:06 – 14:15 [on-site]
[24] Fishing from an ocean of data to foster the development of a knowledgeable and ocean-friendly society
Tymon Zielinski, Marcin Wichorowski, Taco de Bruin, Izabela Kotynska-Zielinska, Tomasz Kijewski, Paulina Pakszys, Katarzyna Romancewicz and Aleksandra Koroza

[abstract] During this presentation we explore the role of researchers and citizens in supporting community-led action on marine sustainability, marine pollution, climate action, and community resilience through engagement and outreach. Co-designed in collaboration with European partners and community representatives, the presentation will explore opportunities and challenges in communicating scientific information to the general public. We will present a number of examples of dedicated ocean actions to promote effective transfer of ocean-related knowledge to citizens. We will discuss issues reflecting the following themes: What resources can scientific organisations provide to support effective knowledge transfer? What format (infographic, brief, video, presentation, etc.) is most effective? How can we identify audience preferences (with regard to engagement activity and output/resource format)?

1.5.4    Least developed countries (LDCs)

14:15 – 14:24 [on-site]
[25] Building ocean STI agreement platforms: findings, implications and lessons from a pilot experience on the South Atlantic
Iara Costa Leite, Carolina Veras Micheletti and Guilherme Kiraly Robles

[abstract] The global emergence of the ocean agenda brings many opportunities and challenges for developing countries. For instance, marine biodiversity can impact job creation, but sustainable practices and knowledge that may be available elsewhere are needed. International collaboration in science, technology and innovation (STI) can be an effective instrument for developing countries to face challenges and realize opportunities related to the ocean agenda. One avenue for identifying partners is to generate data on international ocean STI agreements. As a pilot initiative we mapped international agreements signed by South Atlantic countries. First, governmental online agreement platforms were accessed. In the case of Brazil we also built a platform with agreements that were available only in the Ministry of STI’s paper archive in Brasília. Once ocean-related agreements were mapped, we selected those containing elements of ocean STI, which were systematized according to the following criteria: title, co-signing countries or organizations, partner region, platform or archive of origin, agreement type (umbrella agreement, memorandum etc.), geometry (bilateral, multilateral), axis (South-South, North-South), signing year, validity, principles (mutual gains, reciprocity etc.), ocean STI areas mentioned in the agreement, signing organizations, and implementation timeline and budget. One of our main findings is that, despite the increasingly frequent signing of agreements involving ocean STI in the South Atlantic, they rarely include timelines and budgets. A second finding points to similar numbers of agreements on both axes, which may indicate that South Atlantic governments understand that ocean knowledge is not concentrated in Northern countries. We believe that ocean STI agreement platforms can support stakeholders interested in promoting ocean STI collaboration to: identify partners with which agreements have already been signed in specific priority areas; and better articulate with governmental agencies so that the negotiations that precede written agreements can rely on broad participation of implementing actors, such as scientists and entrepreneurs.
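
The systematization criteria listed above map naturally onto a structured record; the sketch below encodes one fictitious agreement using our own shorthand field names for those criteria:

```python
from dataclasses import dataclass

@dataclass
class OceanSTIAgreement:
    """One platform entry, with fields following the criteria in the abstract."""
    title: str
    cosigners: list
    partner_region: str
    source_platform: str
    agreement_type: str          # umbrella agreement, memorandum, ...
    geometry: str                # bilateral / multilateral
    axis: str                    # South-South / North-South
    signing_year: int
    validity: str
    principles: list             # mutual gains, reciprocity, ...
    sti_areas: list
    signing_organizations: list
    has_timeline: bool = False   # rarely present, per the findings
    has_budget: bool = False

example = OceanSTIAgreement(
    title="Marine science cooperation MoU (fictitious)",
    cosigners=["Brazil", "South Africa"],
    partner_region="South Atlantic",
    source_platform="Ministry of STI paper archive",
    agreement_type="memorandum",
    geometry="bilateral",
    axis="South-South",
    signing_year=2018,
    validity="5 years",
    principles=["mutual gains"],
    sti_areas=["marine biodiversity"],
    signing_organizations=["Ministry of STI"],
)
```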

1.5.5    Digitizing offline data and information

14:24 – 14:33
[26] Creating interactive, visual, open data from historical rare books
Stephanie Ronan, Maurice Clarke

[abstract] The Marine Institute, Ireland, recently launched an Interactive Marine Archive comprised of digitised and visualised historical resources dating back to 1839. The collection covers the Irish Sea and Inland Fisheries reports (1839-1987) and the Scientific Investigations collection (1901-1926). These reports were previously available only by appointment, on site in the Marine Institute archive. However, demand has been consistent, and the books were deteriorating. Research on the data was being duplicated every few years, both in finding and in reproducing it, factors leading to the necessity to digitise the material and make it accessible. The data contained in the reports include annual fish catches per port and gear type for about 100 years. They also contain details of native oyster beds and other habitats listed in the Habitats Directive and on the OSPAR list of Threatened and Declining Species. Other valuable datasets are inventories of benthic and pelagic species recorded in trawl surveys, plankton tows and other sampling exercises. These datasets could act as essential baseline information on the state of the Irish marine environment in the pre- or early-exploitation era. Working with these baselines, we can offer advice to government on the impacts of climate change and fishing. The project was a collaboration between the Library and the Fisheries department and was funded under the European Maritime and Fisheries Fund. The collection was assessed for repair, digitised, OCR-processed and uploaded to the Marine Institute Open Access Repository. Data were extracted from the digitised reports to create a user-friendly Interactive Marine Archive. Interactive features include a Timeline, Oyster Habitat Map, Historical Landings graphs, Meet the Scientists, Research Vessel information and an interactive book viewer. This project results in these historical reports becoming FAIR and part of the digital ecosystem. It showcases a visionary approach to both accessing and understanding our rare, valuable, previously inaccessible collections.
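
The digitise-and-OCR step described can be reproduced with standard open-source tooling; the sketch below uses pytesseract as an illustrative choice (the Marine Institute’s actual toolchain is not stated in the abstract, and the paths are invented):

```python
from pathlib import Path

from PIL import Image
import pytesseract   # Python wrapper around the Tesseract OCR engine

def ocr_scanned_report(page_dir: str) -> str:
    """OCR every scanned page image of one historical report into plain text."""
    pages = sorted(Path(page_dir).glob("*.tif"))
    return "\n".join(pytesseract.image_to_string(Image.open(p)) for p in pages)

# Hypothetical directory of page scans for an 1901 Scientific Investigations report.
text = ocr_scanned_report("scans/scientific_investigations_1901")
Path("scientific_investigations_1901.txt").write_text(text, encoding="utf-8")
```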

14:33 – 14:42
[27] You digitized it, but now what? Exploring computational methods for extracting biodiversity data from historical collections
Amanda L. Whitmire

[abstract] Climate change is driving rapid changes in our biosphere on local and global scales. Our capacity to understand these shifts relies entirely upon two critical things: long-term observations, and the ability to discover and access them. Species occurrence data, which record the presence of a species at a particular place on a specified date, are foundational to understanding biodiversity and tracking changes due to the effects of climate change. Open knowledge bases that gather species occurrence records enable researchers to assess spatio-temporal changes in biodiversity, but observations from the past recorded on paper are often missing. Libraries at several academic marine research stations on the West Coast of North America hold large physical collections of undergraduate student reports. These reports include field observations of species occurrences and populations recorded over a span of nine decades. Each library collection is important within its local context, but taken collectively these papers represent an extremely valuable corpus for conducting biodiversity research. Even after digitization, however, observational data in these papers are still “hidden” in the text. Reading and extracting those data by hand is an effort we cannot realistically undertake. In this presentation, I will describe a collaborative project in which we explore the potential of natural language processing, machine learning, and data visualization to identify and verify species occurrences in unpublished student research papers. I will review how we approach identifying relevant entities in the texts, link them to taxonomic authorities, and create derivative datasets. The final goal of the project is to serve the species occurrence metadata to relevant aggregators, e.g., the Global Biodiversity Information Facility. The overarching message of this talk will be how we can take advantage of computational methods to amplify the work of information professionals in surfacing historical biodiversity data.
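
To make the "link them to taxonomic authorities" step concrete, here is a hedged sketch using the public GBIF species-match API (a real endpoint); the candidate name string is illustrative, and the project's actual pipeline and authorities may differ.

```python
# Hedged sketch of one pipeline step: verifying a candidate species name,
# extracted from a digitized student paper, against a taxonomic authority.
import requests

candidate = "Pisaster ochraceus"  # name surfaced by NER in the text (example)
resp = requests.get(
    "https://api.gbif.org/v1/species/match",
    params={"name": candidate},
    timeout=30,
)
match = resp.json()
if match.get("matchType") != "NONE":
    print(candidate, "->", match["scientificName"], "| GBIF key:", match["usageKey"])
else:
    print(candidate, "-> no confident match; route to manual review")
```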

14:42 – 14:48 Q&A
14:48 – 14:58 Health break

SESSION 2: IMPLEMENTING THE DIGITAL COMMONS

2.1       Data system networking and interoperability technology and methodology: status report

14:58 – 15:07
[28]Blue-Cloud: the European marine thematic platform to explore and demonstrate the potential of Open Science for ocean sustainability
Sara Pittonet Gaiarin
, Dick M. A. Schaap, Pasquale Pagano, Julia Vera and Dominique Obaton

[abstract] The European Open Science Cloud (EOSC) was launched by the European Commission in 2016 to provide a virtual environment with open and seamless services for storage, management, analysis and re-use of research data, across borders and disciplines. Within this framework, Blue-Cloud, a 'Future of Seas and Oceans' flagship initiative of the EU HORIZON 2020 programme, is the thematic EOSC for the marine domain, delivering a collaborative virtual environment to enhance FAIR and Open Science. Started in October 2019, Blue-Cloud is deploying a cyber platform with smart federation of an unprecedented wealth of multidisciplinary data repositories, analytical tools, and computing facilities to explore and demonstrate the potential of cloud-based Open Science and address ocean sustainability, UN Ocean Decade, and G7 Future of the Oceans objectives. Blue-Cloud federates leading European marine research infrastructures (SeaDataNet, EurOBIS, Euro-Argo, ICOS, SOCAT, ELIXIR-ENA, EMODnet, CMEMS) and e-infrastructures (EUDAT, D4Science, WEkEO DIAS), allowing researchers to combine, reuse, and share quality data across disciplines and countries. The federation takes place at the levels of data, computing and analytical service resources. The Blue-Cloud cyber platform comprises:

  • The Blue-Cloud Data Discovery & Access Service, enabling users to find and retrieve multi-disciplinary datasets from multiple repositories.
  • The Blue-Cloud Virtual Research Environment, facilitating access and orchestration of computing and analytical services for constructing, hosting and operating Virtual Labs for specific applications.
  • Five multi-disciplinary Blue-Cloud Virtual Labs, configured with specific analytical workflows and serving as real-life Demonstrators, which can be adopted and adapted for other inputs and analyses.

The sustainability of the Blue-Cloud model and technology is being analysed in its “Roadmap to 2030”, developed in dialogue with stakeholders as a policy document for future strategic exploitation by the wider marine community in Europe and beyond. Blue-Cloud is also building close connections with a range of projects and infrastructures in the marine domain.

15:07 – 15:16
[29]Using a cloud-based architecture to lower the barrier to accessing open ocean data (WOD, OBIS)
Tara Zeynep Baris, Jo Øvstaas and Thomas Fredriksen

[abstract] The Ocean Data Platform (ODP), developed by C4IR Ocean in Norway, has mirrored the World Ocean Database (WOD) in Microsoft's Azure cloud. The ODP provides access to the data through the Ocean Data Connector, a powerful tool for exploration and analysis of WOD and OBIS data. With this tool, researchers can avoid expensive computer setups, or even supercomputer time, and perform data analysis and modeling efficiently. The Ocean Data Connector is powered by the Jupyter ecosystem and Dask, an open-source library for parallel computing. By integrating with our data platform, we bring the data close to the compute environment so that oceanographic data can quickly be queried, fused, and analyzed in the cloud, eliminating the obstacles of storage space and internet speed. Additionally, data are delivered in a single easy-to-use format. Our platform supports a variety of hardware configurations, ranging from low-resource environments such as a personal laptop to high-resource environments with options for adding graphical processing units (GPUs) and other specialised hardware. The user can also create further compute resources in the form of clusters: personalised on-demand supercomputers supported by Dask. Users can share and collaborate in real time in the same environment, and all results are reproducible and backed with full data lineage, meaning that replicating published results has never been easier.
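
As an illustration of the kind of Jupyter-plus-Dask workflow such a connector enables, here is a minimal sketch; the Parquet path, column names, and data layout are hypothetical, not the ODP's actual schema.

```python
# Illustrative parallel analysis of a large cast/profile dataset with Dask.
# The file path and column names are invented for this sketch.
import dask.dataframe as dd
from dask.distributed import Client

client = Client()  # local cluster; a cloud platform would provision a larger one

profiles = dd.read_parquet("wod_casts.parquet")   # lazily loads many partitions
near_surface = profiles[profiles["depth"] < 5]    # near-surface samples only

# Mean temperature per 1-degree latitude band, computed in parallel
banded = (
    near_surface.assign(lat_band=near_surface["lat"].round(0))
    .groupby("lat_band")["temperature"]
    .mean()
)
print(banded.compute().head())  # compute() triggers the distributed work
```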

15:16 – 15:25
[30]Portal proliferation or strengthening the ocean data network
Laura Hanley
, Oliver Williams and Natasha Taylor

[abstract] The growing volume of openly shared ocean data has increased the number of recognised data centres (NODCs and ADUs following the IOC Data Policy) as well as global databases. Feeding into this infrastructure, there is also a growing proliferation of marine and ocean data portals. This transformation deserves celebration and recognition of its value in underpinning ocean science outcomes. However, stronger buy-in to a modern, federated, interconnected approach is still needed to meet users' requirements. This presentation will support the vision of an interoperable network with seamless access to marine data inventories and data collections, moving toward a 'digital ocean ecosystem' to support the Ocean Decade. We will provide real-world examples to demonstrate value, outline the different needs of different users, and discuss the current situation in relation to data flows from our own Cefas Data Portal, both existing and envisaged (e.g., with reference to OBIS, EMODnet, MEDIN, SeaDataNet). We will provide background to build a shared understanding of the current complexities of networking the required data portal landscape, and we will argue that source data systems should continue to maintain control where most appropriate, to save and better focus limited resources. We will also explore the changing mindset needed for this evolving approach. Results: we aim to encourage discussion, collaboration and shared knowledge of the approach; to optimise data flows to better underpin global scientific outcomes; and to better prioritise national and regional efforts. Learnings will include: what is a data portal? (real-case examples and types); capabilities for accessing data, from download to APIs and web map services built on common data and metadata standards; and a user-centric focus covering 'What do different users really want?', comparing generalist users with research specialists and discussing the barriers and frustrations users face.
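
A generic illustration of the "web map services" access route mentioned above, using OWSLib to speak the OGC WMS protocol; the endpoint URL is a placeholder, not the actual Cefas Data Portal address.

```python
# Discover the layers a WMS endpoint publishes; any OGC-compliant portal
# can be queried this way. The URL below is hypothetical.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/geoserver/wms", version="1.3.0")
for name, layer in wms.contents.items():   # GetCapabilities under the hood
    print(name, "-", layer.title)
# A generalist user consumes rendered map images from such a service;
# a research specialist may instead pull raw data via download or API.
```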

15:25 – 15:34
[31]EMODnet Physics – connecting the data dots
Antonio Novellino
, Patrick Gorringe

[abstract] EMODnet Physics, one of the seven European Marine Observation and Data network programme implementation projects, designs, organises and runs operational services unlocking and providing ocean physical data and data products to the wider user community. EMODnet Physics products range from in situ data collections to reanalyses, trends, and climatologies. Data and data products are accompanied by the necessary metadata as well as the applied quality-control procedures. On top of these data products, the system implements standardized interoperability services (ISO 19115-2/19139, OPeNDAP, Web Coverage Service (WCS), Web Map Service (WMS), Web Feature Service (WFS), etc.) as well as custom web APIs and widgets. In Europe, EMODnet Physics is strongly federated with the other major data-aggregating infrastructures: SeaDataNet and its network of National Oceanographic Data Centres (NODCs), the EuroGOOS ROOSs and the CMEMS IN SITU TAC. EMODnet Physics is also integrated with other data sources beyond Europe (ICES, PANGAEA, GLOSS, …). Through its vast network, including industry partners, EMODnet Physics plays a key role in bringing together and supporting diverse marine communities and emerging networks to enhance cooperation and thereby increase the accessibility of marine data. EMODnet Physics links these activities to regional, European and global programmes and initiatives, thereby acting as an enabler in connecting local and regional marine observing activities to the global scale. In this process EMODnet Physics acts as an ambassador and promoter of global data recommendations and standards, interacting with international data collection networks (GOOS – Global HFR, AniBOS, OceanGliders, …) and programmes (SOOS, DOOS, …). These collaborations have greatly increased the data connected to the EMODnet Physics portal. EMODnet Physics also facilitates the development of dedicated mirror versions of its data portal to further stimulate data sharing and provide the data we need for the ocean we want.
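
A sketch of machine-to-machine access via one of the interoperability services listed above (OPeNDAP): the client reads metadata first and transfers only the requested slice. The dataset URL and variable/coordinate names here are placeholders, not actual EMODnet Physics identifiers.

```python
# OPeNDAP access sketch with xarray; URL and names are hypothetical.
import xarray as xr

url = "https://example.org/thredds/dodsC/emodnet_physics/temperature"
ds = xr.open_dataset(url)                 # lazy: metadata only at this point
subset = ds["TEMP"].sel(TIME=slice("2021-01-01", "2021-01-31"))
print(float(subset.mean()))               # only this slice crosses the network
```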

15:34 – 15:43
[32]Towards a metadata profile for Marine Spatial Plans in Europe
Adam Leadbetter
, Alexia Attard, Andrej Abramic, Andrew Conway, Elizabeth Tray, Monica Campillos, Joni Kaitaranta, Adeline Souf, Alessandro Sarretta, Olvido Tello, Michail Vaitis and Clara Zimmer

[abstract] Marine or Maritime Spatial Planning (MSP) is the process whereby authorities analyse and organise human activities in marine areas to achieve ecological, economic and social objectives. EU Member States are required to produce MSP plans under the MSP Directive (2014/89/EU), a process which both creates data in the form of Marine Plans and consumes data intensively as those plans are developed. In order to improve the findability, accessibility and usability of Marine Spatial Plans, and to link them to the datasets informing their development, the Technical Expert Group (TEG) on Data for MSP of the European MSP Platform has identified the need to agree and develop a metadata standard for the description of Marine Spatial Plans. This work needs to focus on the needs of end users in terms of discovery of marine spatial plans, governance (including monitoring) of marine spatial plans, and the updating of MSP plans. The developed standard must be descriptive enough to capture sufficient information to meet these user needs, whilst not being too onerous for creators. To develop this metadata profile, a working group from the TEG has been convened which has identified 16 use cases for the metadata profile, ranging from simple discovery of the Marine Plan and the responsible authorities through to identifying policies related to the Marine Plan and the datasets used to inform its development. Existing formal and informal metadata profiles which may be used to encode information about Marine Plans have been evaluated against these use cases. In this paper we will describe the methodology used to identify the requirements of the metadata profile for Marine Plans, present its initial development, and describe the next steps, which will allow the metadata profile to be connected to systems such as Digital Twins and the OceanInfoHub platform.

15:43 – 15:49 Q&A
15:49 – 16:00 Health break

16:00 – 16:09
[33]Connecting Essential Ocean Variables to datasets using the I-ADOPT interoperability framework ontology and smart mappings
Gwenaelle Moncoiffe, Alison Pamment and Alexandra Kokkinaki

[abstract] Introduction and objective: Interoperability between existing data systems is hindered by the lack of machine-readable connections between terminologies. Programmes such as IOC's International Oceanographic Data and Information Exchange have, for many years, encouraged the adoption of common metadata standards, including recommendations on specific formats and associated controlled vocabularies. Building on these standards, the digital commons for the Ocean Decade need adaptable and connected methods to serve a broad range of end users and system requirements. This paper will present how two operational parameter naming schemes (Climate and Forecast (CF) Standard Names and the BODC Parameter Usage Vocabulary (PUV)) can be efficiently connected using a novel approach to mappings and a common ontological framework. Materials and methods: We will use the framework developed by the Research Data Alliance's InteroperAble Descriptions of Observable Property Terminologies (I-ADOPT) Working Group to map concepts from CF Standard Names and the BODC PUV to a set of common terminologies. Using the NERC Vocabulary Server (NVS), we will show how these mappings can be re-used in SPARQL queries to fulfil a variety of scenarios; one of our test scenarios will be accessing data that relate to selected Essential Ocean Variables. Results: We will show the result of the mapping exercise, highlighting tips and pitfalls, and present the outcome of our “smart” mapping approach for a selected subset of test scenarios. Relevance to data exchange: CF Standard Names and the BODC PUV are both used by key data infrastructures in the marine data ecosystem. Connecting the two, and demonstrating a method that could be applied to other terminologies, facilitates data flows and enables the reproducible and dynamic data aggregations needed to support the Global Ocean Observing System and the UN Sustainable Development Goals. Conclusions: Using a novel approach, we are connecting two key semantic resources to each other and to other ontologies and terminologies, with the potential to enable multidirectional interoperability between data resources and services across and beyond the marine domain.
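
To make the "re-used in SPARQL queries" idea concrete, here is a hedged sketch of querying SKOS mappings on the NERC Vocabulary Server; the exact endpoint path and the example P01 concept URI are assumptions that should be checked against the live NVS.

```python
# Follow SKOS mapping links outward from one BODC PUV (P01) parameter.
# Endpoint path and concept URI are assumed, not verified here.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://vocab.nerc.ac.uk/sparql/sparql")  # assumed path
sparql.setQuery("""
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?mapped WHERE {
  # example P01 concept; any mapped CF or I-ADOPT resource will bind ?mapped
  <http://vocab.nerc.ac.uk/collection/P01/current/TEMPPR01/>
      skos:broadMatch|skos:exactMatch|skos:relatedMatch ?mapped .
}
""")
sparql.setReturnFormat(JSON)
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["mapped"]["value"])
```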

16:09 – 16:18
[34]Digital Twins of the Ocean through Interoperability of existing and future Ocean Data Systems
Ute Brönner
, Kristine Bauer, Arne J. Berre, Uwe Freiherr von Lukas, Jay Pearlman, Georgios Sylaios and Martin Visbeck

[abstract] Digital Twins are already the state of the art in many industries. Manufacturing, health, supply chains, product lifecycle management, energy, transport, and other fields rely on Digital Twins for testing what-if scenarios and supporting decision making. A Digital Twin of the Ocean will allow this kind of scenario testing for applications such as Marine Spatial Planning, environmental impact assessment of anthropogenic activities and climate change, and safe and sustainable operations in the maritime industries (e.g., marine renewables, fisheries, aquaculture, marine traffic), making it possible to study effects and responses before they occur. In this work, we will point out how the implementation of an Interoperability Data Space and federated access to different, already existing Earth observation systems and ocean models will realize the Digital Twin of the Ocean as a system of systems. The EU Green Deal project ILIAD and the endorsed UN programme DITTO will implement interoperable, data-intensive Digital Twins of the Ocean (DTO). They will capitalize on the explosion of new data provided by the many existing Earth observation and modelling systems and combine them with Industry 4.0 systems and applications (such as the Internet of Things, AI analytics, social networks, big data, cloud computing) in a user-engaging and intuitive fashion, including virtual/augmented visualisation. The DTO will exploit all facets of geospatial data and cover the water column, deep sea, sea floor, land, atmosphere, sea ice, and climatic components.
The implementation of the Digital Twin of the Ocean as an Interoperability Data Space built upon standards and best practices will allow a multitude of researchers, commercial end users, policy makers, and the broader public to benefit from access to information and knowledge by following the principles of findability, accessibility, interoperability, and reusability (FAIR data). A marketplace for services and applications will ensure engagement, maintainability, future enhancements, and broad acceptance of the DTO.

2.2       Joining multi-sectoral data: experiences and required action

16:18 – 16:27
[35]Technical solution for harmonizing EU Maritime Spatial Planning within EMODnet
Andrej Abramic
, Joni Kaitaranta, Jose Santiago, Marta Ballesteros, Alessandro Sarretta, Alessandro Pititto and MSP Assistance Mechanism

[abstract] In April 2020, DG MARE and EASME, supported by the MSP Assistance Mechanism, formalized the establishment of the Technical Expert Group (TEG) on Maritime Spatial Planning (MSP) data, to increase dialogue within the European MSP community and move towards more robust and common standards for harmonized national MSP plans. After almost a year, the TEG has proposed a technical solution for harmonizing the data of maritime spatial plans and producing a coherent European MSP map. It is suggested that Member States choose one of the following three recommended “ready-to-use” MSP data models:

  • BASEMAPS: a robust and already operational solution running within the HELCOM spatial data infrastructure, with an established data flow and experience in merging MSP plans at the marine basin level within HELCOM-VASAB cooperation. It is a simple flat data model, with general code lists, following INSPIRE principles based on the Land Use data specifications.
  • MSP INSPIRE data model: based on the Planned Land Use INSPIRE model, adapted for MSP by extending the classification system to fit specific and detailed maritime uses and introducing information on vertical zoning extension. It is a flat data model, suitable for specifying a high level of detail within maritime uses.
  • EMODnet MSP model: a hybrid solution based on the MSP INSPIRE data model, adding specific features from BASEMAPS. It has been developed by the EMODnet Human Activities team, supported and fine-tuned by the TEG.

This technical solution enables the conversion of existing national operational models into an EU-level generalized and modelled overview on the EMODnet portal. This harmonization ensures integrity and versatility: both original and harmonized plans are visible, allowing for comparison across countries, uses and activities. As of October 2021, five plans from EU Member States were already available.

2.3       The IOC Ocean Infohub: experiences and next steps

16:27 – 16:36
[124]The Ocean Infohub Project
Lucy Scott
, Zulfikar Begg, Pier Luigi Buttigieg and Peter Pissierssens

[abstract] The Ocean InfoHub (OIH) Project aims to improve global and equitable access to ocean information, (meta)data and knowledge products for management and sustainable development. The project will leverage a digital exchange architecture to power an openly accessible web platform supporting communication and interoperability between distributed and independent digital systems. To this end, the OIH has supported the development of the Ocean Data and Information System Architecture (ODIS-Arch), reusing common data exchange conventions such as schema.org, to allow existing and emerging ocean data and information systems, from any stakeholder, to begin interoperating with one another and solutions such as OIH. This will enable and accelerate more effective development and dissemination of digital technology and sharing of ocean (meta)data, information, and knowledge. OIH will thus support ODIS – the sum of all participating systems linked through ODIS-Arch – in a co-design process grounded in continuous and actionable digital exchange. OIH will benefit marine and coastal stakeholders across the globe, but its initial focus will be on responding to requests for data products and services from three regions: Africa, Latin America and the Caribbean, and the Pacific Small Island Developing States. The initial priorities for the Project are to develop communities of practice for the three pilot regions, and ODIS-Arch specifications for six priority themes: (i) Experts and institutions/organizations, (ii) Documents, (iii) Spatial data and maps, (iv) Research vessels, (v) Education and training opportunities, (vi) Projects. We will present an overview of the digital architecture, as well as brief use cases by each of the regions, future plans, and contributions to the UN Decade of Ocean Science for Sustainable Development.
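
To illustrate the schema.org-based exchange convention mentioned above, here is a hedged example of the kind of JSON-LD record a provider might expose for harvesting; the field values are invented, and the ODIS-Arch documentation defines the actual required properties for each thematic type.

```python
# Illustrative schema.org Dataset description (values are hypothetical).
import json

record = {
    "@context": {"@vocab": "https://schema.org/"},
    "@type": "Dataset",
    "name": "Example coastal temperature dataset",
    "description": "Hypothetical record exposed for harvesting by OIH.",
    "url": "https://example.org/datasets/42",
    "keywords": ["ocean temperature", "coastal"],
    "spatialCoverage": {"@type": "Place", "name": "Western Indian Ocean"},
}
# Publishing such JSON-LD in a page's <script type="application/ld+json">
# element lets an ODIS-Arch harvester discover and index the resource.
print(json.dumps(record, indent=2))
```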

16:36 – 16:45
[36]Providing Information for the Ocean Infohub
Sandra Sá
, Sérgio Bryton

[abstract] Introduction: High-quality information on marine research projects and vessels is essential for ocean science and technology development. Identification of capabilities, gaps and opportunities; decision making on development and funding priorities; and education and training are examples of activities needing this type of information. However, seamless and integrated access to ocean information, including on research projects and vessels, remains a challenge. Methods: UNESCO-IOC is developing the Ocean InfoHub (OIH) to address this challenge, in cooperation with other relevant partners, including EurOcean. Since 2002, EurOcean has provided comprehensive databases of information on topics related to marine science and technology in Europe, with priority given to two main domains: Marine Knowledge Management (KG) and Marine Research Infrastructures (RID). EurOcean is developing a data lake aimed at the real-time provision of research project and vessel information to the OIH. Results: The EurOcean data lake will comprise information about more than 1,000 infrastructures from 35 countries, divided into 15 categories, and information on over 7,000 European and national marine research projects and their results. Through the OIH, information about research projects and vessels will be made available, together with other ocean-related information, to support researchers, policy makers, think tanks and funding programmes, among other stakeholders. The resulting technological solution will be made available under open source licenses, enabling related initiatives to benefit from the knowledge acquired in this endeavour. Conclusion: For over 20 years, EurOcean has gathered, maintained, and provided free access to information about marine research projects and infrastructures through the RID and KG. Both databases have facilitated various marine research, policy support, and education and training activities in Europe and worldwide. By developing a data lake and integrating it with the OIH, EurOcean will contribute to seamless access to “The Data We Need for the Ocean We Want”.

2.4       Data and information products and services: new developments

16:45 – 16:54
[37]ProtectedSeas Navigator – how regulation-centered marine protected area data improves marine protection assessments
Timothe Vincent
, Claire Colegrove, Alex Driedger and Jennifer Sletten

[abstract] Accurate data about marine protected areas (MPAs) and marine managed areas (MMAs) are critical to assess the marine protections currently in place and to inform the creation of new areas. The ProtectedSeas Navigator is a guide to the regulatory seascape containing information on regulatory protections for ocean spaces around the globe. Navigator is an open-access resource available via an interactive public map, the ESRI platform, and file downloads. The database provides detailed and up-to-date information on the activity restrictions that apply within each MPA and MMA, and it is the only resource of its kind. Navigator uses a coding system to indicate the level of restriction for over 25 marine activities, including extraction, entry, speed, anchoring, commercial and recreational fishing, and certain types of fishing gear. These regulatory data, combined with verified boundaries, support complex analyses and assessments of marine protections as well as protection comparisons between countries and regions. Navigator can help with marine protection assessments towards national and international conservation targets. As of October 2021, Navigator covers North America, Europe, the High Seas, and many other regions worldwide, representing over 80 countries and territories and 12,000 individual areas; the team is committed to completing a global database in 2022. While progress toward conservation targets has historically relied on coverage statistics derived from protected area boundaries, there is added value in looking beyond spatial boundaries to see protection potential through the lens of regulation-based protection and spatially cumulative gear and activity restrictions. In some regions, the Navigator team found a large degree of regulatory and jurisdictional overlap. Given these overlapping place-based measures, Navigator can be used to explicitly measure cumulative management impact and obtain a more accurate picture of protection status.

17:03 – 17:12 [on-site]
[39]C-RAID: improving access to historical drifter data (Copernicus Reprocessing of Argos and Iridium Drifters)
Thierry Carval
, Patricia Zunino Rodriguez, Jean Philippe Rannou, Paul Poli, Frédérique Blanc and Christophe Billon

[abstract] C-RAID is a data rescue project for the global reprocessing of drifting buoy data and metadata, covering 25 000 drifting buoys deployed between 1979 and 2018. The data of 10 000 drifting buoys deployed between 1997 and 2010 have been processed in delayed mode (including comparison with the ERA5 reanalysis). The project continues in 2022 to reprocess the remaining 15 000 drifting buoys.
The C-RAID deliverables are an improved drifting buoy data archive and FAIR interfaces to the data: web data discovery for human users, and API services for data discovery, subsetting and download. By an “improved drifting buoy data record” we mean: missing datasets and parts of datasets recovered (data rescue); homogeneous and rich metadata and data format (NetCDF-CF); homogeneous expert QC on marine and atmospheric data; and matchup ERA5 data (temperature, atmospheric pressure, wind). C-RAID phase 1 data, metadata and documentation are now published at https://doi.org/10.17882/77184. Our FAIR data commitments:

  • Findable: DOI published on DataCite & Google Schema.org, link with bibliography and authors (ORCID)
  • Accessible: one-click download, anonymous access, a big-data API and a highly interactive faceted portal
  • Interoperable: CF and SeaDataNet standards, QC documented
  • Re-usable: CC-BY license

The C-RAID stakeholders are within the ocean-atmosphere community: C-RAID high-resolution data complement ICOADS buoy data and improve ERA reanalyses. The Copernicus Marine service (CMEMS) and Copernicus Climate Change service (C3S) use C-RAID temperature for satellite calibration/validation, and for model and reanalysis validation or forcing (the global “OSTIA” L4REP and L4NRT use drifter SST to complement satellites with a maximum number of sensors and to resolve mesoscale structures). The drifter SST data are assimilated by CMEMS MFCs to constrain the surface fields. The global ESA SST CCI and C3S L4REP use in situ drifter SST to validate fields built from satellite-only data.
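
A minimal sketch of re-using a reprocessed C-RAID NetCDF-CF file with xarray; the file name and variable names are illustrative only (the DOI landing page documents the real layout).

```python
# Read one reprocessed drifter trajectory and compare the buoy temperature
# with its ERA5 matchup field, if present. Names are hypothetical.
import xarray as xr

traj = xr.open_dataset("craid_drifter_300234.nc")   # one buoy, NetCDF-CF
sst = traj["TEMP"]                                  # drifter sea temperature
era5 = traj.get("TEMP_ERA5")                        # matchup field (assumed name)
if era5 is not None:
    print("mean drifter-minus-ERA5 offset:", float((sst - era5).mean()))
```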

17:12 – 17:21
[40]A global ocean oxygen database and atlas for assessing and predicting deoxygenation and ocean health in the open and coastal ocean
Marilaure Grégoire
, Véronique Garçon, Hernan Garcia, Denise Breitburg, Kirsten Isensee, Andreas Oschlies, Maciej Telszewski, Alexander Barth, Henry Bittig, Jacob Carstensen, Thierry Carval, Fei Chai, Francisco Chavez, Daniel Conley, Laurent Coppola, Sean Crowe, Kim Currie, Minhan Dai, Bruno Deflandre, Boris Dewitte, Robert Diaz, Emilio Garcia-Robledo, Denis Gilbert, Alessandra Giorgetti, Ronnie Glud, Dimitri Gutierrez, Shigeki Hosoda, Masao Ishii, Gil Jacinto, Chris Langdon, Siv K. Lauvset, Lisa A. Levin, Karin E. Limburg, Hela Mertens, Ivonne Montes, Wajih Naqvi, Aurélien Paulmier, Benjamin Pfeil, Grant Pitcher, Sylvie Pouliquen, Nancy Rabalais, Christophe Rabouille, Virginie Recape, Michaël Roman, Kenneth Rose, Daniel Rudnick, Jodie Rummer, Catherine Schmechtig, Sunke Schmidtko, Brad Seibel, Caroline Slomp, U. Rashid Sumaila, Toste Tanhua, Virginie Thierry, Hiroshi Uchida, Rik Wanninkhof and Moriaki Yasuhara

[abstract] Oxygen (O2) is considered an effective indicator of ocean health and climate change. Its level, distribution and variability from sub-seasonal to multidecadal scales give relevant information on the physical and biogeochemical functioning of the ocean. Current evidence indicates that the coastal and open ocean have been losing O2 since the middle of the last century, with consequences for living organisms and biogeochemical cycles that are not yet fully understood. Yet there are currently uncertainties and differences between estimates of deoxygenation rates in the global ocean, attributable to the scarcity of accessible data, the use of different datasets and the employment of different mapping techniques and models. In this talk, we outline the need for a coordinated international effort towards building an open-access Global Ocean Oxygen Database and ATlas (GO2DAT) complying with the FAIR principles, combining data from the coastal and open ocean measured from Eulerian and Lagrangian platforms, and adopting a community-agreed metadata format with fully documented quality control and flagging procedures. A roadmap towards GO2DAT is proposed which involves engagement from the scientific community, data providers, data managers and end users. Building GO2DAT would make it possible to fully harness the boost in O2 profiles, whose number is expected to quadruple under the future GOOS strategy. It will allow users to make an informed choice on which data are fit for purpose, facilitate the dissemination of information on ocean deoxygenation to a wide community of stakeholders, and contribute to the education of younger generations and the general public. Engagement around GO2DAT will be promoted by the UN Ocean Decade programme Global Ocean Oxygen Decade (GOOD).

17:21 – 17:27 Q&A
17:27 Adjourn

DAY 2: 15 February 2022

09:00 – 09:09
[41]ASFA Subject Vocabulary: supporting internationalisation of ocean science through participation in AGROVOC
Tamsin Vicary

[abstract] The ASFA Subject thesaurus has been in use since ASFA first began publishing its database as a printed abstracting and indexing journal in 1971. Covering all aspects of oceanography and the aquatic sciences, the thesaurus has been maintained by a wide range of international experts, including the Food and Agriculture Organization of the United Nations (FAO). Uses of the ASFA thesaurus include indexing records on the ASFA database and at a number of repositories and library catalogues around the world, including AquaDocs. In parallel to ASFA's development, FAO created in the 1980s the multilingual AGROVOC thesaurus, which covers all FAO's areas of interest, including fisheries and aquaculture. AGROVOC has become a key vocabulary, widely used with more than 50 million accesses a year. AGROVOC applies international data models based on World Wide Web Consortium (W3C) recommendations and the Semantic Web, with the objective of facilitating the accessibility and visibility of any data indexed using the vocabulary. In 2016 the FAIR principles were defined with the objective of promoting (F)indable, (A)ccessible, (I)nteroperable and (R)eusable data. AGROVOC complies with these principles, notably I2, under which the description of metadata elements should follow community guidelines that use open, well-defined vocabularies: “The controlled vocabulary used to describe datasets needs to be documented and resolvable using globally unique and persistent identifiers. This documentation needs to be easily findable and accessible by anyone who uses the dataset” (https://www.force11.org/group/fairgroup/fairprinciples). Since joining AGROVOC in 2019, the ASFA Thesaurus has been transformed from paper to digital. We demonstrate how this transformation has improved multilingualism, standardised terminology and applied international data standards in order to increase the access to and visibility of resources using the ASFA thesaurus. We will place these changes in the context of ASFA's broader transformation into an open information network focused on improving access to information and data on the world's oceans and aquatic environments.

09:09 – 09:18
[42]PORTO Online – Featured data services in practice
Aldo Drago
, Audrey Zammit, Adam Gauci, Anthony Galea, Joel Azzopardi, Raisa Galea de Giovanni, Giuseppe Ciraolo, Fulvio Capodici, Salvatore Aronica, Alessio Langiu, Giovanni Giacalone, Ignazio Fontana, Rosario Sinatra, Elisabetta Paradiso, Salvatore Campanella and Vincenzo Ruvolo

[abstract] The digital era has opened new realms for ocean data delivery. As more users become dependent on reliable information derived from multiple data sources, non-professional users are increasing in number, with demands different from those of professional users. Technology is leading to a step change in the value-addition chain from data to information, knowledge and intelligence, providing sophisticated user experiences online, with faster delivery and service elaborations on a wider range of more affordable smart devices such as iPhones, iPads and other wireless devices. While providing more elaborate deliveries to expert users, digital environments add a new dimension to data services by matching fast delivery to user experience, expectations and demand. This has led to the concept of featured data services. In contrast to traditional generic services, featured services are usually dedicated to a category of users with specific needs, providing routine data-derived products to support their operational day-to-day activities or production lines. User categories can come from industry, for example the tourism sector, which requires higher-level information services. Compared to the static delivery of pre-prepared products determined at source, featured web services use dynamic added-value production composed directly by the online user, who can customise the service to fit a specific need or query. Four essential ingredients of featured services are: (i) GIS-like web tools for geographic rendering and mapping; (ii) user-specified dynamic content delivery; (iii) online functionalities to elaborate data online; and (iv) real-time updating of data and information. The CALYPSO South project (www.calypsosouth.eu) provides online featured data services through PORTO Online, offering met-ocean information delivered especially to aid harbour masters, port authorities and operators in the shipping and maritime services in the proximity of the Maltese Islands and the south of Sicily.

09:18 – 09:27
[43]Increasing FAIRness of marine data within ENVRI-FAIR
Sylvie Pouliquen
, Thierry Carval, Valerie Harscoat, Peter Thijsse, Ivan Rodero, Benjamin Pfeil, Katrina Exter and ENVRI-FAIR WP9 partners

[abstract] The ENVRI-FAIR project engages environmental Research Infrastructures (RIs) covering the subdomains Atmosphere, Marine, Solid Earth, and Biodiversity/Ecosystems. The overarching goal of ENVRI-FAIR is that all participating research infrastructures improve their FAIRness and become ready for machine-to-machine access, e.g., for services in the European Open Science Cloud. The Marine subdomain of ENVRI-FAIR focuses on Euro-Argo ERIC, EMSO ERIC, ICOS and LifeWatch, as RIs listed on the ESFRI roadmap, as well as SeaDataNet as the European marine data management infrastructure. The overall aim is to analyse the FAIRness of each of these RIs and implement the actions necessary to improve it. Given the ENVRI-FAIR challenge of multiple RIs and multiple subdomains, the agreed way forward is that the FAIR principles will be implemented within each RI to improve FAIRness at three levels: 1) to better serve its users; 2) to facilitate the development of cross-RI services at the Marine subdomain level; and 3) to facilitate the development of cross-subdomain services at the level of the ENVRI-FAIR cluster. The approach is bottom-up, respecting the autonomy of RIs concerning requirements and solutions, while maintaining close and regular interaction with experts in ENVRI-FAIR and sharing implementation activities with other subdomains. Based on these enhanced services, a demonstrator in the form of a metadata broker will be available in early 2022 through the ENVRI-FAIR HUB as a scientific use case. It will serve Copernicus Marine, SeaDataNet, and EMODnet as integrators and processors of marine EOV data accessed via the sustained RI machine-to-machine enhanced services. By demonstrating that FAIR services at RI level facilitate the reusability of marine RI data by integrated European services, it will pave the way to extending those methods to other marine RIs and user communities in Europe, through a white paper on sustainable marine data management.

09:27 – 09:36
[44]Baltic Data Flows
Matthew Richard

[abstract] The Baltic Data Flows project, co-financed by the Connecting Europe Facility of the European Union, seeks to enhance the sharing and harmonisation of data on the Baltic marine environment originating from existing sea monitoring programmes, and to move towards service-based data sharing. In particular, open datasets will be made available by HELCOM to a wider community, such as the European open data ecosystem, researchers, NGOs and the private sector, so that they can benefit from the availability of harmonised environmental data. Wider dissemination is achieved by sharing HELCOM metadata records with the European Data Portal. Baltic Data Flows will build the capacity of the consortium's national environmental data hosting organisations and providers in terms of quality control and solutions for making harmonised environmental data available. Members of the consortium will build and enhance their ICT infrastructure to better support the data sharing process. Furthermore, data harvesting systems based on Application Programming Interfaces (APIs) will be developed with the aim of automatically integrating national datasets into a combined and harmonised regional dataset for partner institutes. Baltic Data Flows will also develop tools and indicator data flows for eutrophication, hazardous substances and biodiversity assessment, including further development of assessment tools and tool outputs in close cooperation with relevant expert groups. The project runs from October 2020 to October 2023 and involves seven partners from across the Baltic Sea region. The IODE event will provide the opportunity to present the latest developments and progress of the project.
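
A hedged sketch of the API-based harvesting pattern described above: pull national records and combine them into one harmonised regional table. The endpoints, field names and output file are invented for illustration, not the project's actual interfaces.

```python
# Harvest national monitoring records via (hypothetical) APIs and merge them
# into a single regional dataset with provenance.
import pandas as pd
import requests

NATIONAL_APIS = {  # placeholder endpoints, one per contributing country
    "FI": "https://example.fi/api/monitoring/chlorophyll",
    "EE": "https://example.ee/api/obs/chl",
}

frames = []
for country, url in NATIONAL_APIS.items():
    records = requests.get(url, timeout=60).json()  # assumed: list of dicts
    df = pd.DataFrame(records)
    df["country"] = country            # keep provenance in the merged set
    frames.append(df)

regional = pd.concat(frames, ignore_index=True)     # combined Baltic dataset
regional.to_csv("regional_chlorophyll_harmonised.csv", index=False)
```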

09:36 – 09:45
[45]Imardis – A Data Infrastructure Serving the Needs of the Offshore Renewable Energy Sector
David Mills
, Noel Bristow, Cathy Blakey, Gwyn Roberts, Sudha Balaguru and Vahid Seydi

[abstract] As part of the ERDF-funded SEACAMS2 programme, the Imardis (Integrated Marine Data and Information System) data infrastructure has been developed to meet the requirements of the Offshore Renewable Energy (ORE) sector in Wales. The sector required easy access to very large, trusted datasets to reduce the time to insight and de-risk business decisions, in order to progress the development of the sector in Wales. The objective was to build and operate a fit-for-purpose data infrastructure allowing marine datasets to be discoverable and accessible. A stakeholder workshop and other engagement activities were undertaken to determine user requirements. The architecture is based on a series of micro-services, accessible through a RESTful JSON-based Application Programming Interface (API) layer. The services are implemented in Java and deployed within the Amazon Web Services cloud environment, which is scalable in terms of storage capacity and throughput. A streaming data system is being developed to cater for next-generation IoT sensors, and further analysis of the data is available through a data analytics portal. Imardis adopted the UK MEDIN metadata standards to ensure harmonisation with the existing UK marine data management landscape and with European standards. The system ensures that access to data is FAIR (Findable, Accessible, Interoperable and Reusable), with a data discovery and download service to promote data exchange. Results: The system has been live since early 2020 and has attracted over 100 users from all sectors. There are currently 350 datasets available for interrogation and download, covering bathymetry, CTD, ADCP and sub-bottom profile data, with more planned for the future (e.g., grab samples, passive acoustics, wave data). The development of the Imardis system has shown that large marine datasets can be made available to industry through a fit-for-purpose architecture, as demonstrated by the growing user base.

09:45 – 09:54 [recording]
[46]New Developments with the Open Access to the GTS (Open-GTS) project
Kevin O’Brien
, Kevin Kern, Bill Smith, Darin Figurskey, Darren Wright, Brian Tetreault, Greg Johnson, Hermann Asensio, Antje Schremmer, Kai-Thorsten Wirt, Simon Elliot and David Berry

[abstract] The majority of the in situ, near-real-time marine meteorological and oceanographic data used in operational forecasts and forecasting centres is distributed and accessed via the WMO's Global Telecommunication System (GTS), a component of the WMO Information System (WIS). The Open Access to the GTS (Open-GTS) project was developed to improve access to the GTS for the exchange of observation data in near real time. The project has been developed and supported by the GOOS Observations Coordination Group (OCG) and involves several international partners. In collaboration with the US Integrated Ocean Observing System (IOOS), an Open-GTS implementation strategy was developed that provides a roadmap for others to implement the defined workflow. This workflow is being implemented operationally through the US IOOS Regional Associations (RAs) for submission of their data to the GTS. In addition, the Open-GTS project recently partnered with the NOAA/NWS and the US Army Corps of Engineers to develop a pilot project to receive, process and submit AIS meteorological and oceanographic (METOC) data from ships to the GTS. This workflow has the potential to significantly increase the amount of coastal met-ocean data available for weather and safety-of-life-at-sea forecasts. In this presentation, we will discuss some of the new developments of the Open-GTS project, including experience with third-party ocean vessel data as well as potential links to several UN Ocean Decade programmes and projects. We will also illustrate how Open-GTS fits into the WIS 2.0 evolution and describe a WIS 2.0 data exchange pilot project based upon Open-GTS services.

09:54 – 10:00 Q&A
10:00 – 10:10 Health break

10:10 – 10:19
[47]Navigating an Ocean of Data: Approaches for Sharing Novel USV Data with Existing Data Centers
Kimberley Sparling
, Eric Lindstrom

[abstract] Uncrewed surface vehicles (USVs) are a relatively new data collection platform for the in situ oceanographic community. Primarily wind- and solar-powered, Saildrone USVs can endure deployments at sea for up to 12 months and can withstand extreme conditions such as a Category 4 hurricane in the Atlantic, the Southern Ocean winter, and the edge of Arctic sea ice. As a result, there is a growing number of unique datasets acquired via Saildrone USVs from around the globe. Quality assessment, review, and archiving of those datasets are typically the responsibility of the institution or researcher who commissioned the “mission”. This approach is sufficient for a small number of missions and campaigns; however, as fleet sizes and the variety of deployments increase for saildrones and other classes of USVs, there is a growing need to unify the approach to quality checks, storage, and access to these datasets for broad use across the scientific community. It is also clear that USV data require cataloguing with a unique platform classification, highlighting their difference from ship and research vessel data and from drifting and moored buoy data. This presentation will focus on the approaches taken to date with the handful of data centres that have archived and disseminated Saildrone USV data; examples include the NASA JPL PO.DAAC, EMODnet, and the NOAA Open-GTS project. The objective is to identify current and future challenges for the ocean data community in incorporating this growing data source in research and operational applications.


10:19 – 10:28 [on-site]
[48]Ocean Acidification Data for Sustainable Development – implementing an interoperable infrastructure
Benjamin Pfeil, Kirsten Isensee, Katherina Schoo, Pieter Provoost and Oliver Clements

[abstract] In 2015, the United Nations adopted the 2030 Agenda and a set of Sustainable Development Goals (SDGs), including a goal dedicated to the ocean, SDG 14. The Intergovernmental Oceanographic Commission (IOC) of UNESCO was identified as the custodian agency for SDG Indicator 14.3.1 (“Average marine acidity (pH) measured at agreed suite of representative sampling stations”). In this role, the IOC is tasked with the collection of relevant data from Member States. An IOC survey identified challenges with data and metadata handling and with the management of sub-variables of the EOV Inorganic Carbon within many NODCs and ADUs in the IODE network. To facilitate data submission and overcome current obstacles, the IOC and its IODE programme set up the SDG 14.3.1 Data Portal, a tool for the submission, collection, validation, storage and sharing of ocean acidification data and metadata according to the SDG 14.3.1 methodology. Aligned with the FAIR Guiding Principles, current activities by the IOC and experts from IODE, PML Applications, UiB and ICOS aim to integrate data and metadata by enhancing interoperability and establishing a federated data integration/ingestion system using ERDDAP services with standardized controlled vocabulary lists fit for purpose for SDG 14.3.1. The short-term aim is to achieve interoperability between the major data providers in the field (e.g., SeaDataNet, EMODnet, NOAA NCEI, ICOS) and relevant data products (e.g., SOCAT, GLODAP, EMODnet), while in the long term it is envisaged to include the entire IODE network of NODCs/ADUs. This presentation will highlight challenges and solutions and introduce SDG 14.3.1 as an applied showcase of how interoperable and coordinated services for data and metadata can enhance collaboration, fulfil national contributions towards the 2030 Agenda, and move towards easier and more transparent data submission procedures that are fit for purpose.
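
To illustrate the federated ERDDAP approach, here is a hedged sketch of querying an ERDDAP node programmatically with the erddapy package; the server URL, dataset id, and variable names are placeholders, not the actual SDG 14.3.1 portal configuration.

```python
# Query a (hypothetical) ERDDAP tabledap dataset and get an analysis-ready table.
from erddapy import ERDDAP

e = ERDDAP(server="https://example.org/erddap", protocol="tabledap")
e.dataset_id = "oa_station_monthly"                       # assumed dataset id
e.variables = ["time", "latitude", "longitude", "pH_total"]
e.constraints = {"time>=": "2020-01-01"}                  # server-side subsetting

df = e.to_pandas()   # one call from query to pandas DataFrame
print(df.describe())
```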

10:28 – 10:37 [on-site]
[49]SeaDataNet – further upgrading and improving FAIRness of the SeaDataNet CDI Data Discovery & Access service
Dick M.A. Schaap
, Michèle Fichaut

[abstract] SeaDataNet is a pan-European infrastructure for managing marine and ocean data, whose core partners are National Oceanographic Data Centres (NODCs) and oceanographic data focal points from 34 coastal states in Europe. Currently SeaDataNet gives discovery of and access to more than 2.65 million datasets for physical oceanography, chemistry, geology, geophysics, bathymetry and biology from more than 850 data originators. The population is increasing steadily through cooperation with and involvement in many associated EU projects and major EU initiatives such as EMODnet (in particular EMODnet Chemistry, Bathymetry, Physics, and Ingestion). The infrastructure has been set up in a series of EU core and associated projects since the early 1990s. A major upgrade was made in the EU SeaDataCloud project, which ran from November 2016 until mid-2021. The upgrade included adoption of the cloud via a strategic and technical cooperation between SeaDataNet and the EUDAT consortium of e-infrastructure service providers, which is also relevant to SeaDataNet's ambition to play a role in the major EU DG RTD initiative to establish a European Open Science Cloud (EOSC). A major upgrade concerned the CDI Data Discovery and Access service, which provides users access to marine data from more than 110 connected data centres. The latest version maintains a cloud-based data cache which is synchronised by online replication from data centres. The system now allows data centres themselves to submit and check new submissions in batches of CDI metadata and associated datasets, while additional checks are performed by cloud-based algorithms. Moreover, capability for handling several new data types, such as glider and HF-radar data, was added. Since the launch, further developments have taken place to improve the FAIRness of the system, inter alia by enriching metadata, using linked data, and developing several machine-to-machine services for machine-actionability, such as SPARQL, APIs and web services.

10:37 – 10:46 [on-site]
[50]Innovations in remote and autonomous ocean data acquisition systems to enable the digital twin of the ocean
Louis Demargne
, Lex Veerhuis, Ivar de Josselin de Jong and Hugh Parker

[abstract] As one of the beneficiaries of the ocean's shared resources, private industry has much to gain from a healthy ocean and therefore has a responsibility to help ensure its longevity. Understanding the ocean and oceanic processes is critical to mitigating development risks and realising societal benefits in a sustainable ocean economy. A vital starting point is to build a digital twin of the ocean, including the seafloor, as a key building block for a range of scientific studies on ocean processes and resources. Geo-data technologies and solutions that support ocean mapping and ocean observations are now advancing at an unprecedented pace. Innovations involving remote operations, autonomy, robotics, artificial intelligence, and cloud automation are key enablers of these ambitions. These core technologies are already starting to change the way ocean science data and information are collected and shared.
This paper focuses on the next step in this technological evolution: acquiring bathymetry and other ocean science data from uncrewed and autonomous vehicles operating remotely for extended periods. We will discuss the achievements to date and the ongoing challenges of operating uncrewed surface vehicles (USVs) equipped with a broad range of sensors, together with remotely operated vehicles (ROVs) and the associated cloud-based digital workflows for bathymetric surveying and subsea inspection. Both the USV and the ROV are controlled “over the horizon” via a secure satellite connection to remote operations centres (ROCs) on land, where the USV crew, ROV pilots and data processors are based. These proven solutions have a multiplying effect that will accelerate the development of a digital twin of the ocean. Not only do these innovations contribute meaningfully to the UN Ocean Decade goals, they also allow the maritime industry to reduce carbon emissions and improve safety by removing crew from the hostile offshore environment, among other benefits.



10:46 – 10:55
[51]Novel software for oceanographic cruise planning, execution and results database management
Sharon Z Herzka
, Saúl Delgadillo-Rodríguez, Carmina Llamas, Paula Ramírez

[abstract] Oceanographic expeditions generate discrete hydrographic, biogeochemical, and biological data from water column, sediment and community samples, reflecting a plethora of analyses conducted in multiple laboratories and institutions and covering numerous variables. Their integration into a centralized, searchable and interoperable database that complies with international standards (units and metadata) is necessary to ensure the permanence of and access to the data, and is also conducive to interdisciplinary research and the application of novel statistical methods. The main challenges are achieving consensus on data formats and selecting the metadata to be included and the standards to be implemented. Once consensus is achieved, a precisely defined capture process that ensures the quality of data and metadata must be implemented. Faced with the need to integrate the results of dozens of cruises performed by the Gulf of Mexico Research Consortium (CIGOM), financed by Mexico's CONACYT-SENER Hydrocarbon Fund (2015-2022), we developed a software package with an online interface that (1) supports detailed expedition planning, (2) allows for real-time documentation of the outcome of a cruise or monitoring plan, (3) generates cruise reports including detailed sample inventories, (4) couples discrete hydrographic data from CTDs with sample inventories, and (5) implements a common data structure for storing a wide variety of data, ranging from hydrography to biogeochemical parameters, rate measurements, contaminants and community composition following the WoRMS classification. The database allows each data point to be linked to the sample's origin (i.e., cruise and sample metadata) as well as to authorship metadata. It has an online, user-friendly interface that allows targeted searches with a wide variety of filters and download formats. The software can easily be adapted to capture data from past or future expeditions independent of the original data format and could provide a valuable tool for data management and sharing.

10:55 – 11:04
[52]Towards a comprehensive, FAIR, ocean biogeochemical data product system
Nico Lange
, Toste Tanhua, Benjamin Pfeil, Siv Lauvset, Are Olsen, Dorothee Bakker, Björn Fiedler, Annette Kock, Henry Bittig, Reiner Schlitzer and Arne Körtzinger

[abstract] Synthesis ocean data products tailored around specific biogeochemical (BGC) Essential Ocean Variables (EOVs) have the potential to greatly improve today's BGC ocean data usage and to implement the FAIR principles. These products constitute key outputs of the Global Ocean Observing System, laying the observational foundation for several climate- and ocean-health-related end-user information products and services. Here, we describe and assess the existing BGC EOV data products: the Global Ocean Data Analysis Project (GLODAP) and the GEOTRACES intermediate products, with full-depth water column data from research expeditions; the Surface Ocean CO2 Atlas (SOCAT), with in situ surface ocean fCO2 (fugacity of carbon dioxide) measurements from ships, moored stations, and autonomous and drifting surface platforms; and the MarinE MethanE and NiTrous Oxide product (MEMENTO). We systematically assess the products' maturity using the Framework for Ocean Observing readiness-level concept for data management and information products; to this end, the data flow and policy, interoperability, front end and impact of each product are evaluated. Recognizing that the importance of adequate and comprehensive data on the EOVs will grow, our vision is to use this assessment to sustainably support the activities on these data products individually and to arrive at one overarching, cross-platform, FAIR BGC data management system.

11:04 – 11:10 Q&A
11:10 – 11:20 Health break

11:20 – 11:29
[53]New products and services based on HF radar data in the RAIA Observatory (NW Iberian Peninsula)
Silvia Piedracoba
, Silvia Allen-Perkins, Garbiñe Ayensa, Enrique Álvarez-Fanjul, Ana Basañez, Alexandre Costa, Carlos Fernandes, Miguel Gilcoto, Pablo Lorente, Jose Matos, Pedro Montero, Lino Oliveira, Vicente Pérez-Muñuzuri, Waldo Redondo, Gabriel Rosón, Mª Isabel Ruíz, Catarina Ribero, João Seca, Silvia Torres, Ramiro Varela, Marta Vázquez, Begoña Vila and Jose Ramón Viqueira

[abstract] Objective: RAIA Observatory is a cross-border ocean observing and forecasting system for the coast and shelf of the NW Iberian Peninsula (Spain – Portugal). The observatory was initiated in 2009 and is in constant evolution, facing new challenges. Several European projects have enabled the RAIA Observatory to develop and increase its capabilities to support coastal communities. New user-oriented products have been developed to exploit the potential of HF radar technology in the framework of the RADAR ON RAIA project. Materials and Methods: Homogenization of maintenance procedures, standardization of data formats and of their accessibility, as well as data quality monitoring in near real-time. Development of a spatial data infrastructure compatible with the Observatory.
Relevance to data exchange: These new user-oriented products have been implemented locally for the NW Iberian Peninsula, giving support to the aquaculture and fisheries (upwelling index) or marine renewables (HF wave product) sectors. All of them are easily transposable to similar regions such as California, Perú and Benguela. Results: Development of products and services to support efficient management of the marine environment based on the HF radar network of the RAIA Observatory. A wide range of products is being obtained, involving all the variables derived from HF radar technology. As a starting point, we are implementing three specific products: 1) an upwelling index derived from HF radar-derived surface currents; 2) operational observation of waves from HF radar stations; 3) a combined product consisting of daily averages calculated from HF radar and buoy winds, and daily satellite data (SST and Chl a L4 products) served by Copernicus. Conclusions: The cross-border RAIA Observatory allows a coordinated transnational response regarding ocean observations, giving support to decision makers in key maritime sectors, which will contribute to the development of a sustainable Blue Economy.
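
As a rough illustration of product 1), an upwelling index can be derived from surface currents by projecting them onto the cross-shore direction and averaging daily. The sketch below is a toy example, not the RAIA algorithm; the coastline angle and the sign convention are assumptions.

```python
# Toy daily upwelling proxy from HF-radar surface currents (not the RAIA code):
# average the cross-shore component of hourly currents over one day.
import numpy as np

def daily_upwelling_proxy(u, v, coast_angle_deg):
    """u, v: hourly eastward/northward surface currents (m/s), shape (24,).
    coast_angle_deg: coastline orientation measured clockwise from north.
    Sign convention (offshore positive) depends on which side the coast lies."""
    theta = np.deg2rad(coast_angle_deg)
    offshore = u * np.cos(theta) - v * np.sin(theta)  # cross-shore component
    return offshore.mean()

rng = np.random.default_rng(0)
u = rng.normal(0.10, 0.05, 24)   # synthetic currents for one day
v = rng.normal(-0.20, 0.05, 24)
print(daily_upwelling_proxy(u, v, coast_angle_deg=20.0))
```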

11:29 – 11:38
[54]Climatological distribution of dissolved inorganic nutrients in the Western Mediterranean Sea (1981-2017)
Malek Belgacem
, Katrin Schroeder, Alexander Barth, Charles Troupin, Bruno Pavoni, Patrick Raimbault, Nicole Garcia, Mireno Borghini and Jacopo Chiggiato

[abstract] Ocean life relies on the supply of dissolved inorganic nutrients and other micro-nutrients into the euphotic layer. They fuel the phytoplankton growth that maintains the equilibrium of the food web. Ocean circulation and physical processes continually drive the large-scale distribution of chemicals toward a homogeneous distribution; biological and biogeochemical processes counteract this tendency. Describing nutrient dynamics is therefore important to understand overall ecosystem functioning. At the global scale, most biogeochemical descriptions are based on model simulations and satellite data, since nutrient observations are generally infrequent and not homogeneously distributed in space and time. Climatological mapping is often used to describe the biogeochemical state of the ocean, representing monthly, seasonal or annual averaged fields. Within this context, the Western MEDiterranean Sea BioGeochemical Climatology (BGC-WMED) presented here is a product derived from in situ observations from various data sources (the CNR-WMED biogeochemical dataset, SeaDataNet and MOOSE data): in total, 2253 in-situ inorganic nutrient profiles over the period 1981-2017 have been used. Mean gridded nutrient fields for the period 1981-2017, and for the sub-periods 1981-2004 and 2005-2017, on a horizontal 1/4° × 1/4° grid have been produced. The climatology is built on 19 depth levels for the inorganic nutrients nitrate, phosphate and orthosilicate. To generate smooth and homogeneous interpolated fields, an advanced N-dimensional version of DIVA, DIVAnd v2.5.1, has been used.
A sensitivity analysis was carried out to assess the comparability of the data product with the observational data. The BGC-WMED has then been compared to other available data products, i.e. the medBFM and the WOA18. This work has been accepted for publication in ESSD https://doi.org/10.5194/essd-2021-149 and the BGC-WMED data product is available on PANGAEA https://doi.pangaea.de/10.1594/PANGAEA.930447.
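
DIVAnd itself is a Julia package, so the snippet below is only a Python illustration of the preparatory gridding step that such a climatology rests on: binning scattered profiles onto the 1/4° grid before any variational smoothing is applied. The domain bounds are merely indicative.

```python
# Illustration only (DIVAnd is a Julia package): bin scattered nutrient
# observations onto a 1/4-degree grid and average per cell.
import numpy as np

def bin_to_grid(lon, lat, val, lon_edges, lat_edges):
    sums, _, _ = np.histogram2d(lon, lat, [lon_edges, lat_edges], weights=val)
    counts, _, _ = np.histogram2d(lon, lat, [lon_edges, lat_edges])
    with np.errstate(invalid="ignore"):
        return sums / counts       # NaN where a cell holds no observations

lon_edges = np.arange(-6.0, 16.25, 0.25)   # indicative W-Mediterranean box
lat_edges = np.arange(35.0, 45.25, 0.25)
# grid = bin_to_grid(obs_lon, obs_lat, obs_nitrate, lon_edges, lat_edges)
```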

11:38 – 11:47
[55]EMODnet Bathymetry – establishing the best digital bathymetry for European seas
Thierry Schmitt
, Dick M. A. Schaap and George Spoelstra

[abstract] EMODnet aims at assembling European marine data and data products from diverse sources in a uniform way. Since 2008, EMODnet Bathymetry has been developing and publishing, at regular intervals, the EMODnet Digital Terrain Model (DTM) for European seas, each time including more observations and improving quality, precision and publication. The DTM is produced from survey data and composite DTMs, referenced in the SeaDataNet catalogues. SeaDataNet is a pan-European infrastructure run by the NODCs in Europe. Bathymetry data sets are gathered and populated by national hydrographic services, marine research institutes, and companies. Currently, the service comprises 31,000+ survey datasets and 240+ composite DTMs and Satellite Derived Bathymetry (SDB) products, from 70 data providers from 24 countries. SDB data fill gaps in the coverage of coastal zones. The EMODnet Bathymetry OGC web services are integrated in the IHO DCDB Data Viewer for global sharing. A major selection of datasets is used in generating the EMODnet DTM, while gaps are completed with GEBCO DTM data. The latest EMODnet DTM (2020 version), released at the end of 2020, covers all European seas, including part of the North Atlantic and Arctic Oceans and the Barents Sea. The DTM has a resolution of 1/16 × 1/16 arc minutes (circa 115 × 115 m²) and contains approx. 12.3 billion grid nodes. From all data sources, a total of 16,260 unique data references are included.
The DTM can be freely viewed in a versatile browser viewer and downloaded, and is also shared via OGC services (WMS, WFS, WCS, WMTS). Powerful 3D visualisation functionality is included. In addition, the EMODnet Bathymetry World Base Layer (EBWBL) OGC WMTS service was launched, providing a standard grid resolution for topography and bathymetry for the whole world by combining the EMODnet DTM, GEBCO 2019, and a satellite-derived DEM for land cover. EMODnet Bathymetry contributes actively to the Seabed 2030 initiative of IOC and IHO.
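
For readers wanting to script against these OGC services, the sketch below shows one way to request a rendered map with OWSLib; the endpoint URL and layer name are assumptions to be checked against the current EMODnet Bathymetry documentation.

```python
# Hedged sketch: fetch a rendered bathymetry map via WMS using OWSLib.
# The service URL and layer name below are assumptions, not verified values.
from owslib.wms import WebMapService

wms = WebMapService("https://ows.emodnet-bathymetry.eu/wms", version="1.1.1")
print(sorted(wms.contents))                      # discover available layers
img = wms.getmap(layers=["emodnet:mean"],        # assumed layer name
                 srs="EPSG:4326",
                 bbox=(-10.0, 45.0, 0.0, 52.0),  # lon/lat bounding box
                 size=(800, 560),
                 format="image/png")
with open("bathymetry.png", "wb") as f:
    f.write(img.read())
```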


11:47 – 11:56
[56]EMODnet Seabed Habitats: collecting habitat maps once, using many times
Helen Lillis, Eimear O’Keeffe, Aldo Annunziatellis, Sabrina Agnesi, Harriet Allen, Lewis Castle and Jordan Pinder

[abstract] EMODnet Seabed Habitats hosts the largest European collection of habitat maps from individual surveys and survey-based sample points. There are nearly 1,000 habitat maps from surveys and nearly 500,000 sample points, freely available to view via the online interactive map or by WMS. Of these, around 900 habitat maps and 355,000 sample points are also freely available to download from the website or by WFS. This compilation of polygons and points, with standardised attributes and metadata, presents a great opportunity to create new products that aim to answer specific questions, such as ‘what is the current known extent of habitat X in region Y?’. For Europe, we have produced these so-called ‘composite data products’ for the Seagrass cover, Macroalgal canopy cover and Live hard coral cover Essential Ocean Variables (polygons and points) and for Biogenic substrate (polygons only). The procedure is always similar: (1) identify the most complete sources of spatial data on seabed habitat types in the target region; (2) for the target habitat(s), search the most commonly used classification systems for habitats/biotopes/biocoenoses that describe the target habitat(s); (3) extract data from the sources identified in (1) and filter to select only the habitats/biotopes/biocoenoses identified in (2); (4) apply rules to remove overlapping information (if needed); (5) compile all data into a single data product, making sure the provenance of every polygon and point is clear in the attribute table. The end products are only as good as the data that feed them, and there are currently data gaps, but in producing these products we have demonstrated the added value of compiling and standardising seabed habitat maps and sample point data. EMODnet Seabed Habitats is always searching for new habitat maps and sample points to grow the collections and welcomes contributions from anywhere.
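
Steps (3)-(5) of that procedure map naturally onto a few lines of GeoPandas. The sketch below is a hypothetical illustration (file paths, column name and habitat codes are invented), not the EMODnet Seabed Habitats production code.

```python
# Hypothetical compilation of a composite habitat product: filter source maps
# to target habitat codes and merge them, keeping provenance per polygon.
import geopandas as gpd
import pandas as pd

TARGET_CODES = {"A5.53", "A5.545"}   # invented example codes for seagrass

def compile_composite(paths):
    parts = []
    for path in paths:
        gdf = gpd.read_file(path)
        sel = gdf[gdf["habitat_code"].isin(TARGET_CODES)].copy()
        sel["source_map"] = path     # provenance kept in the attribute table
        parts.append(sel[["habitat_code", "source_map", "geometry"]])
    return gpd.GeoDataFrame(pd.concat(parts, ignore_index=True))

# composite = compile_composite(["survey_map_1.gpkg", "survey_map_2.gpkg"])
# composite.to_file("seagrass_composite.gpkg", driver="GPKG")
```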


11:56 – 12:05
[57]OpenOceanCloud
Ryan P Abernathey
, Chelle Gentemann

[abstract] For decades, sparse data constrained oceanographic research, but now we are drowning in a flood of data. Oceanography has been revolutionized through the development of new observing technologies (e.g. satellites, environmental DNA, acoustics, autonomous floats, and gliders) that deliver vast amounts of data every day. Alongside these observations, a suite of numerical models has emerged which simulate the ocean with ever-increasing detail and realism. To meet this challenge, we need an international collaboration to accelerate the development of cloud-based data infrastructure for oceanography: OpenOceanCloud. This partnership between universities, research institutes, and industry will leverage open data, open-source software, and cloud computing to build a collaboration platform that can be used by students and researchers across the world, with an emphasis on broadening participation in research. OpenOceanCloud builds on the very real and concrete foundations of Pangeo, an international community and software/infrastructure platform for big-data geoscience. The Pangeo project has successfully brought over 1 petabyte of open, public CMIP6 data to the commercial cloud and deployed data-proximate computing solutions that have transformed climate research. A core value of the Pangeo project was to build with the open-source community, using and extending existing, community-accepted tools. We now propose to bring this approach to bear on oceanography, creating a global federation of data sites and computing solutions using open standards and cloud-native open-source technology. The OpenOceanCloud approach enables interdisciplinary, collaborative research using large, complex datasets, supporting discoveries in key areas such as ocean prediction, air-sea interaction, and climate-change impacts. At the same time, this platform is potentially one of the most transformative tools for environmental justice, empowering local communities with the data and access they need to amplify their goals globally. OpenOceanCloud will realize the dream of the “data commons” that the Ocean Decade requires.
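
The data-proximate pattern described here typically looks like the following in the Pangeo stack: open a cloud-hosted, analysis-ready Zarr store lazily and compute only what is needed. The store path and variable name below are placeholders; real CMIP6 cloud holdings are discovered through catalogues such as intake-esm.

```python
# Pangeo-style, data-proximate access: lazy, chunked reads from object storage.
# The bucket path and variable name are placeholders, not a real dataset.
import fsspec
import xarray as xr

store = fsspec.get_mapper("gs://cmip6/some/zarr/store")   # placeholder path
ds = xr.open_zarr(store, consolidated=True)               # no bulk download
sst_clim = ds["tos"].groupby("time.month").mean("time")   # evaluated lazily
```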

12:05 – 12:14
[58]A cloud-based tool for standardized and integrated oceanographic data access: A CCADI use case for ocean acidification key variable collections in Baffin Bay, Canada
Claire Herbert
, Tahiana Ratsimbazafy, Yanique Campbell, Tonya Burgers, Pascal Guillot and Tim Papakyriakou

[abstract] The Canadian Consortium for Arctic Data Interoperability (CCADI) is working towards the development of a portal for Arctic data sharing and discovery, and the production of data integration and analysis tools. CCADI use case 3 focuses on creating tools to simplify the processing of data required to measure ocean acidification in Baffin Bay, Canada. Key variables related to climate and carbon cycle change are selected to build a standardized, interoperable dataset. The source datasets, however, are heterogeneous and unstandardized, and collecting and curating such data requires significant time and effort from researchers. This use case aims to provide a simplified tool to retrieve and analyze data in the cloud, without having to process the raw data locally. The oceanic variables used are collected with a rosette lowered from the CCGS Amundsen at different locations to create vertical profiles. Laboratory analysis of water samples provides nutrient data, as well as oxygen-18, Total Alkalinity and Dissolved Inorganic Carbon (DIC). A Seabird CTD attached to the rosette produces vertical profile data in a proprietary format (BTL), providing practical salinity, temperature and depth. The combined dataset is analyzed in the cloud using the program CO2SYS to produce carbonate system variables, after which satellite Sea Ice Concentration (SIC) products from the Canadian Ice Service (CIS) and Bremen University are integrated. The final dataset can be visualized on a map before being downloaded by users as a CSV or shapefile. Users can also download the sea ice maps in raster format. This approach would considerably reduce the time and effort spent on data curation and processing, allowing for greater focus on the scientific interpretation and application of the data. The tool produced here will be part of the new architecture under development in the CCADI project.
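
The carbonate-system step described here can be reproduced with PyCO2SYS, a Python implementation of CO2SYS; the input values below are illustrative placeholders, not CCADI data.

```python
# Solve the carbonate system from total alkalinity and DIC with PyCO2SYS.
# All numbers are illustrative placeholders, not CCADI measurements.
import PyCO2SYS as pyco2

results = pyco2.sys(
    par1=2305.0, par1_type=1,   # total alkalinity (umol/kg)
    par2=2150.0, par2_type=2,   # dissolved inorganic carbon (umol/kg)
    salinity=33.1,              # practical salinity from the CTD
    temperature=1.5,            # in-situ temperature (deg C)
    pressure=50.0,              # pressure (dbar) from the CTD
)
print(results["pH"], results["saturation_aragonite"])
```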

12:14 – 12:20 Q&A
12:20 – 13:30 Lunch

13:30 – 13:39 [on-site]
[59]SeaDataCloud data products and value chain through a collaborative approach
Simona Simoncelli
, Christine Coatanoan, Volodymyr Myroshnychenko, Örjan Bäck, Helge Sagen, Serge Scory, Paolo Oliveri, Gelsomina Mattia, Kanwal Shahzadi, Nadia Pinardi, Alexander Barth, Charles Troupin, Reiner Schlitzer, Michèle Fichaut and Dick M.A. Schaap

[abstract] The SeaDataCloud (SDC) data value chain aims at generating data products whose quality reflects the coordination capacity in managing multidisciplinary in situ data according to the FAIR principles. Data from different providers are integrated and harmonized thanks to a shared quality assurance strategy and quality control methodologies applied at various stages of the data value chain, adopting and continuously refining community software and tools (i.e. ODV, DIVAnd). The goal of the SDC products team was to release the best aggregated datasets, derived climatologies and new products (i.e. Ocean Heat Content, Mixed Layer Depth) for the EU marginal seas and to serve diverse user communities. The aggregated datasets contain all temperature and salinity observations harvested from the SeaDataNet infrastructure. The SDC Quality Assurance Strategy is iterative and consists of four phases: data harvesting, file/parameter aggregation, quality check, and feedback to data providers on data anomalies. It allows the quality of the database content to be continuously improved and enhanced versions of the data products to be derived. The systematic analysis of the regional datasets shows a progressive increase in the available data and their quality. A novel metadata analysis allows monitoring of the EU data-sharing landscape and detection of systematic errors (format, flagging) and data/metadata omissions. The SDC regional climatologies were designed with a harmonized approach to integrate, for the first time, SDC aggregated datasets with external sources (WOD, CMEMS). For the global ocean, the dataset used is WOD18; a novel quality control procedure has been developed and climatological ensemble estimates have been produced.
Products are available through the SDC web catalogue, accompanied by Product Information Documents (PIDoc) containing all specifications about a product’s generation and usability to facilitate user uptake. Digital Object Identifiers (DOI) are assigned to both products and PIDocs to foster transparency of the production chain and acknowledge all actors involved, from data originators to product generators.

13:39 – 13:48
[60]EMODnet Geology – harmonizing geological data of the European seas and beyond
Susanna Kihlman
, Henry Vallius and EMODnet Geology partners

[abstract] Proper maritime spatial planning, coastal zone management, management of marine resources, environmental assessments and forecasting require a comprehensive understanding of the seabed. In response to these needs, the European Commission established the European Marine Observation and Data Network (EMODnet) in 2008. The EMODnet concept is to assemble existing but often fragmented and partly inaccessible marine information into harmonized, interoperable, and publicly and freely available data layers encompassing whole marine basins. As the data and data products are free of restrictions on use, the programme supports European maritime activities by promoting sustainable use and management of the European seas. Now in its fourth phase, the EMODnet Geology project is delivering integrated geological data products that include seabed substrates, sediment accumulation rates, seafloor geology (including lithology and stratigraphy), Quaternary geology and geomorphology, coastal behaviour, geological events such as submarine landslides and earthquakes, marine mineral resources, as well as submerged landscapes of the European continental shelf at various time frames. All new map products are presented at a scale of 1:100,000 throughout, and in even more detail where the underlying data permit; a multi-scale approach is adopted whenever possible. However, the EMODnet concept is not restricted to the European seas: the Caspian and Caribbean Seas are also included in the geographical scope of the EMODnet Geology project, and selected methods are shared with the EMODnet PArtnership for China and Europe (EMOD-PACE) project. The EMODnet Geology project is executed by a consortium of 40 partners, whose core is made up of 24 European geological surveys (EuroGeoSurveys) backed by 16 other partner organizations with valuable expertise and data.

13:48 – 13:57
[61]Cal/Val web-based applications for the Mediterranean and Global Ocean Forecasting Systems
Vladyslav Lyubartsev
, Nadia Pinardi, Emanuela Clementi and Simona Masina

[abstract] The multi-functional web-based application named Cal/Val, which automatically monitors model skill metrics, has been developed to operationally validate ocean model results against in situ observations. Currently, the application deals with two advanced operational oceanographic models run at CMCC (https://www.cmcc.it/):
• the Mediterranean Forecasting System (MedFS, https://medfs.cmcc.it/), run operationally in the framework of the Copernicus Marine Environment Monitoring Service (CMEMS, https://marine.copernicus.eu/);
• the Global Ocean Forecast System (GOFS16), implemented at CMCC since early 2017.

MedFS Cal/Val (https://evalid.cmcc.it/evaluation/calval/) validates model results against in situ fixed-mooring observations derived from the EMODnet Physics data portal (https://portal.emodnet-physics.eu/), providing time series of sea level, temperature, salinity, and ocean currents. GOFS16 Cal/Val is devoted to the evaluation of vertical profiles provided by Argo floats. It presents the RMSE and bias of 3D temperature and salinity using misfits with Argo CTD data collected from the CMEMS INS TAC. Statistics are calculated on-the-fly for the selected time intervals, depths, and sub-regions. Cal/Val is designed on a client/server architecture. The server part includes web and database servers; the SQL database is updated operationally with in situ values and the corresponding model values. The client part (browser) is implemented in JavaScript, and client-server interaction uses the JSON data-interchange format. Cal/Val applications can not only present multiple in situ and model datasets, but also support both model evaluation and model inter-comparison.
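
The skill metrics named here have simple definitions: for misfits d_i = model_i - obs_i, the bias is the mean of d and the RMSE is the root of the mean squared d. A minimal sketch for one selected depth, sub-region and time window:

```python
# Minimal model-vs-observation skill metrics (bias and RMSE of the misfits).
import numpy as np

def skill(model, obs):
    d = np.asarray(model, float) - np.asarray(obs, float)  # misfits
    return {"bias": d.mean(), "rmse": float(np.sqrt((d ** 2).mean()))}

# e.g. modelled vs Argo-observed temperature at one depth level:
print(skill([15.2, 15.0, 14.8], [15.0, 14.9, 15.1]))
```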

13:57 – 14:06
[62]Emerging Data Management Practices and Infrastructure for the Canadian Integrated Ocean Observing System
Reyna Jenkyns
, Jonathan Pye and Pauline Chauvet

[abstract] The Canadian Integrated Ocean Observing System (CIOOS) launched in 2019, following a decade of initiatives seeking nationally coordinated ocean observations. Over the last few years, CIOOS established a national system with three Regional Associations (Atlantic, St. Lawrence, Pacific) and governance structures for coordination and expert input, including Executive, Science and Technical Committees. Within the Technical Committee, there are task groups with regional representation for biological data, data management plans, metadata, model data, solution architecture, user experience and visualization. We will highlight the data management progress made and the challenges experienced during multi-stakeholder coordination at CIOOS. At the conclusion of the pilot phase in 2020, data management approaches were established for the initially prioritized Essential Ocean Variables (EOVs) of temperature, salinity, oxygen, water level, currents, and nutrients. The digital infrastructure includes a network of CKAN metadata catalogues using a common ISO 19115 metadata profile, ERDDAP services for CF-compliant datasets, and web portals with basic information, an asset map and regionally customized data products. During the second phase, ending March 2022, the EOVs have expanded to include all the GOOS EOVs. Recent efforts have focussed on determining data management approaches for Ocean Biodiversity Information System (OBIS) contributions, model outputs, marine debris and Global Telecommunications System data pipelines. Decisions on metadata and data schemas, controlled vocabularies and data granularity are key factors for all of these data types. Exploratory work and partnerships for Indigenous data governance are underway, moving towards improved attribution and recognition of sensitive data concerns. A metadata entry form and a data exploration portal that leverage the earlier infrastructure have recently been developed. While these data management achievements have already increased the FAIRness of ocean data in Canada, the CIOOS stakeholders and coordination frameworks will continue to provide the foundation for further advancements as we embark on the UN Ocean Decade.
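
As a concrete illustration of consuming the ERDDAP services mentioned above, the sketch below uses the erddapy client; the server URL and dataset identifier are assumptions to be checked in the CIOOS catalogues.

```python
# Hedged sketch: query a CIOOS ERDDAP tabledap dataset into a DataFrame.
# The server URL and dataset_id are assumptions, not verified identifiers.
from erddapy import ERDDAP

e = ERDDAP(server="https://data.cioospacific.ca/erddap", protocol="tabledap")
e.dataset_id = "example_ctd_profiles"       # hypothetical dataset id
e.variables = ["time", "latitude", "longitude", "sea_water_temperature"]
e.constraints = {"time>=": "2021-01-01T00:00:00Z"}
df = e.to_pandas()                          # tidy table, one row per record
```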


14:06 – 14:15 [on-site]
[63]Oceanographic data and information system for Polish NODC initiative
Marcin Wichorowski
, Krzysztof Rutkowski, Michał Wójcik, Lena Szymanek, Urszula Pączek and Mirosława Ostrowska

[abstract] A Polish scientific consortium consisting of the Institute of Oceanology Polish Academy of Sciences, the Polish Geological Institute National Research Institute, the National Marine Fisheries Research Institute, the University of Gdańsk, the Maritime Institute of the Maritime University in Gdynia, the Pomeranian Academy in Słupsk and the University of Szczecin, all long involved in marine research, has consolidated efforts and undertaken actions to make Polish oceanographic scientific data resources publicly accessible from one national repository. These organizations have established good international cooperation and, as members of international projects, organizations and initiatives, already provide data to international data and information centres and systems. In order to increase the Polish contribution to pan-European oceanographic data resources, they are working together to establish the structure of the Polish National Oceanographic Data Committee and an ODIS infrastructure for future cooperation. The consortium successfully submitted a project proposal in the frame of the Digital Agenda Poland programme and was awarded the project eCUDO.pl (Oceanographic Data and Information System). This project aims to harmonize Polish oceanographic data and make them interoperable through the implementation of agreed standards for information structure and communication protocols. The project is developing and deploying ODIS as a distributed infrastructure for data management, providing open access to oceanographic data resources. Present activities encompass the harmonization of environmental data collection and its preservation in accordance with INSPIRE requirements and SeaDataNet standards, securing resources for data management and stewardship, as well as the digitalization of hardcopy data archives. The most significant expected results are better data findability, accessibility and interoperability and, finally, a higher potential for reuse of the data collected during years of research activity. In line with the programme requirements, the design of the system is user-oriented and driven by the development of services demanded by users. This ODIS is open to all stakeholders and ready to accommodate other organizations and data resources.

14:15 – 14:24 [on-site]
[64]SatBałtyk System, the data sharing platform and a modern tool for the Baltic Sea monitoring
Mirosława Ostrowska
, Mirosław Darecki, Adam Krężel, Dariusz Ficek and Kazimierz Furmańczyk

[abstract] The SatBałtyk System is both an innovative research tool that meets the requirements of modern oceanography and a platform providing environmental data. It was developed and deployed by a consortium associating four scientific institutions: the Institute of Oceanology Polish Academy of Sciences (coordinator), the University of Gdańsk, the Pomeranian Academy in Słupsk and the University of Szczecin. The SatBałtyk System uses satellite remote sensing data along with traditional oceanographic measurements, supplemented by ecohydrological modelling, to provide nowcast environmental information on the condition and trends of the Baltic Sea ecosystem. Such a synergy of heterogeneous data sources is the only way of satisfying the need for comprehensive, high-quality, continuous and near real-time monitoring of the marine environment. This required theoretical foundations, complex research and IT infrastructure, and operational procedures to be in place. As a result, the SatBałtyk System has been monitoring, and providing through the www.satbaltyk.pl website, a variety of structural and functional properties of the Baltic Sea ecosystem since 2015. These parameters are divided into eight task-oriented groups: Atmosphere, meteorology; Hydrology; Ocean optics; Radiation budget; Sea water components; Phytoplankton, photosynthesis; Coastal zone; and Hazards. The spatial and vertical coverage of the system extends to the entire Baltic. Open-access data are provided as digital maps or graphs, or they can be downloaded. The system also provides historical data along with short forecasts of some modelled parameters. This collection of data describing the Baltic ecosystem can not only serve scientific analyses but can also support decisions regarding the economy, management and protection of its resources. The next step is to facilitate access to these data for a wide group of stakeholders. The resources of the SatBałtyk System will boost the Oceanographic Data and Information System eCUDO.pl (a project funded by EU POPC.02.03.01-IP.01-00-0062/18).

14:24 – 14:30 Q&A
14:30 – 14:40 Health break

2.5       Leaving no one behind: the need for ocean data and information capacity development and IOC’s role

14:40 – 14:49
[65]Role of Invemar’s Regional Training Center on Ocean and Information Capacity Development for Spanish Speakers Community
Paula Cristina Sierra-Correa
, Francisco A. Arias-Isaza

[abstract] Capacity development (CD) starts from the principle that people are best empowered to realize their full potential when they have the knowledge and skills to understand the relevant processes. CD is an essential principle of IODE-IOC-UNESCO; in this sense, it has established the OceanTeacher Global Academy (OTGA) as a global network of Regional and Specialised Training Centres (RTCs & STCs). INVEMAR, as a Regional Training Centre in marine sciences for Spanish speakers in Latin America and the Caribbean (LAC) since 2014, provides personalized training for professionals and postgraduate students to increase capacity in ocean and coastal knowledge, services and management. Training courses are governed by quality standards (ISO 29990:2010). The training needs identified by the RTC correspond to national and international commitments within the CD Strategy of IOC-UNESCO and the training gaps identified in the Global Ocean Science Report, as well as the training needs of the Conventions (Biological Diversity, the Climate Change Convention under Article 6, and the RAMSAR Strategic Plan 2016-2024 as related to coastal and marine wetlands). Constant interaction with IOCARIBE, CPPS, LMEs and projects in the LAC region contributes to this, drawing on the priorities they have identified. During the period 2014-2021, RTC-Invemar developed more than 40 courses, training around 1000 people on topics such as GIS, MPAs, ICZM-MSP, Blue Carbon, SDG 14 indicators and Data Science, among others. The pandemic year was very challenging for the whole world, including RTC-Invemar, which nevertheless delivered 5 of the 9 OTGA virtualized courses. For the next decade, the RTC will include new initiatives focused on the societal outcomes of the UN Decade of Ocean Science 2021-2030. Communications strategies will be implemented in order to establish new alliances and open offers of learning services in different formats (face-to-face, virtual, or hybrid), allowing a more ambitious reach for the OTGA in the region.


14:49 – 14:58
[66]Data Inclusivity and stewardship in blue resources sustainability in the decade of action
Isa Elegbede Olalekan

[abstract] Data on blue resources are derived from aquatic plants, animals and the human dimensions associated with water systems. Despite the vast volume of blue resources on Earth, there are still issues with poverty, hunger and mismanagement of these resources, caused mainly by uncoordinated management, unsustainable practices and inadequate data stewardship. This study proposes inclusive approaches, through a market-based sustainability framework, to promote aquatic resource data management. The process adopted combines a collaborative inclusivity approach model with theory-of-change models to direct the operability of use cases of ocean data. The outcome of this study reveals that, for an adequate blue economy to be achieved globally, it should incorporate the “no one should be left out” approach through active and participatory data collection and management, with involvement in the decision-making process. It should also synergistically combine bottom-up and top-down approaches to data collection and utilisation for common resource users. The theory-of-change aspect of the study reveals possible externalities and unfavourable circumstances that would affect the attainment of the desired growth, such as political and economic situations along the value chain and natural disasters that are mainly beyond the control of the actors. These outcomes align with the requirement of the SDGs’ decade of action that, to effectively achieve sustainable resource management, inclusivity is crucial for growth through appreciation of the data value chain.

2.6       Data Science: scientific insight through data management

14:58 – 15:07
[67]How the marine data management at European scale can provide quality datasets to evaluate marine litter issues and contribute to the improvement of the existing monitoring processes
Matteo Vinci, Maria Eugenia Molina Jack, Alessandra Giorgetti, Alexia Cociancich, Alessandro Altenburger, Elena Partescano, François Galgani, Amandine Thomas, Erwann Quimbert and Morgan Le Moigne

[abstract] Introduction and Objective: Marine litter is a global concern. EMODnet Chemistry is a unique regional-scale initiative that has been working on the management of EU marine litter data since 2017. The objective is the setting-up of a data management workflow to provide standardized, quality-controlled information. Materials and Methods: The initial phase, analysing the state of the art, was crucial to identify the consolidated communities and best practices and to set the strategies for managing diverse litter data types. The workflow focuses on the standardization and interoperability of data, a rigorous pre-ingestion quality control procedure, and a set of post-ingestion analyses. The result of this work is good-quality datasets provided to users and stakeholders in different forms: raw data, data collections and maps. Relevance to data exchange: A great boost to the data collection, standardization and quality control activity was provided by the close collaboration with the MSFD Technical Group on Marine Litter (TG-ML) and JRC, EU Member States and the relevant National Oceanographic Data Centres. This led to the first EU beach litter baselines, and the same is now ongoing for seafloor trawl surveys and floating microlitter.
Data from citizen science and research are also integrated, enabling the harmonized inclusion of multiple data sources. Results: The EMODnet Chemistry data management workflow provides several added values. The standardization and quality control deliver a single EU dataset, easily accessible in a unique format, with a uniform level of quality. The creation of specific data outputs has enabled the definition of thresholds for beach litter as an operational target for policy makers and environmental managers. The numerous maps generated provide a comprehensive view of the distribution of marine litter in Europe. Conclusions: Marine data management at national, regional and global scales is a key point to provide an integrated insight that helps to tackle global concerns like marine litter.


15:07 – 15:16
[68]Marine biodiversity advances in a digital era
Hanieh Saeedi

[abstract] The transmission of data and information is not always a transparent process; it depends on a series of assumptions about the shared understanding of words and concepts. The wider the user communities of data are, the less likely it is that those assumptions will be fully met, and the greater the need to work to ensure that understanding is reached. One example of this is the language barrier. Information and data generation can also create storage problems for many organizations, and it is sometimes difficult to allocate funding for maintaining that storage. All aspects of the availability and transfer of data, information and knowledge entail costs of some sort for someone, and this can become a real issue when resources are limited. Meanwhile, the lack of strategies for addressing the demands of stakeholder communities, in particular for fit-for-use biodiversity data, is an issue. To address some of the challenges mentioned above, digital data (or Digital Specimens) aim to create digital-only workflows that facilitate digitization, curation, and data links, thus returning value to physical specimens by creating new layers of annotation and developing automated approaches to advance marine biodiversity discovery and conservation. Mass-digitization and data sharing efforts will not only facilitate large-scale biodiversity studies, but also help answer the fundamental questions of what the main drivers of biodiversity patterns are and how climate change might shift the distribution ranges of deep-sea species in the future. This is fundamental knowledge and essential information for any science-policy report intended to inform decision makers towards establishing management and sustainable solutions.

15:16 – 15:25
[69]Which ocean data do we need to develop a forecasting system for shellfish safety?
Pedro Reis Costa
, Susana Rodrigues, Sónia Pedro and Marta Belchior Lopes

[abstract] The marine environment provides a range of ecosystem services and benefits, including the provision of protein food sources. Shellfish harvesting responds to the increasing demand for seafood products and contributes to the economic sustainability of coastal regions. However, shellfish may act as vectors of contaminants to humans. Harmful algal blooms (HABs) are responsible for shellfish contamination by biotoxins, and urban and agricultural runoffs, highly influenced by rainfall and floods, are the main drivers of microbial contamination. To minimize the risk of acute intoxications, most coastal countries carry out an official control of their shellfish producing areas by weekly determining marine toxin and E. coli levels in shellfish and assessing the presence of toxic phytoplankton species in seawater. Many of these programs have been running for over two or three decades, making the official control data, collected at regular time intervals, a promising time-series database for predicting shellfish contamination based on past observations. Moreover, coupling available large datasets from several other sources, including remote sensing products, with historical environmental parameters obtained from the routine environmental surveys of the shellfish producing areas can potentially feed forecasting models that may guide management actions. Still, data from the official control of shellfish contamination remain difficult to access, as do data on phytoplankton abundance in coastal areas. Also, abiotic parameters, such as in-situ measurements of temperature, salinity, and ocean currents, are not easily available. The strategy we are implementing is based on integrating dispersed historical time-series data from all sources to develop a web platform with data modelling and visualization tools to predict shellfish contamination by time and harvest zone along the Portuguese coast.
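
Purely as an illustration of how such weekly official-control series can feed a forecasting model (this is not the project's actual method), one common baseline is to regress next week's toxin level on lagged toxin, cell-count and environmental features:

```python
# Toy one-step-ahead contamination forecast from lagged weekly features.
# Synthetic data throughout; not the platform's model or data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
weeks = 300
toxin = rng.gamma(2.0, 40.0, weeks)          # weekly biotoxin levels (synthetic)
cells = rng.gamma(2.0, 100.0, weeks)         # toxic phytoplankton counts
sst = 15 + 4 * np.sin(np.arange(weeks) * 2 * np.pi / 52)  # seasonal SST

X = np.column_stack([toxin[:-1], cells[:-1], sst[:-1]])   # features, week t
y = toxin[1:]                                             # target, week t+1
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print(model.predict(X[-1:]))                 # next-week forecast for one zone
```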

15:25 – 15:34
[70]Toward a Fully Automated Nonlinear Quality Control of Temperature and Salinity Historical Datasets for Ocean Climatology
Kanwal Shahzadi
, Nadia Pinardi and Antonio Navarra

[abstract] A robust quality control procedure is a prerequisite for the computation of large-scale ocean climatologies based on historical in situ datasets. In this study, a semi-automatic Nonlinear Quality Control (NQC), similar to Jia et al. (2016), is developed, incorporating information on the intrinsic nature of the uncertainties of historical in-situ observations in terms of representativeness errors and outliers. The three essential steps of the NQC are: (1) the subdivision of the study domain, (2) the construction of a first-guess quality control, and (3) the iterative nonlinear quality control. The first-guess quality control is somewhat similar to the traditional approach, where a threshold is set on the second-order moment of the statistical distribution of in-situ observations in each subdomain to eliminate the most prominent outliers. The third step of the NQC is iterative and uses the results of the previous step to advance: it again applies a threshold to eliminate observations, considering the second moment of a gridded estimate obtained from the observations in each subdomain. At each iteration, outliers are eliminated until convergence, i.e., no more observations are eliminated and there are no outliers in the remaining dataset. K-means clustering and a mapping technique, the Data-Interpolating Variational Analysis (DIVA), are used in the new NQC scheme. The NQC procedure has been tested on the Northwest Pacific, North Atlantic and South Atlantic for different depths and months with three different domain subdivisions: a regular subdivision of the domain, and regime-oriented divisions obtained using a knowledge-based approach and K-means clustering. The results show that the NQC effectively rejects outliers using the regime-oriented subdivisions, whether based on K-means clustering or on knowledge-based principles. The proposed NQC has the potential to be applied to large historical ocean datasets and to the global ocean domain to compute reliable estimates of large-scale ocean climatologies.
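
The iterative core of such a scheme is compact to state; the sketch below shows the idea for a single subdomain (a plain k-sigma loop to convergence), leaving out the gridding, clustering and DIVA mapping that the full NQC adds.

```python
# Minimal iterative outlier rejection for one subdomain (not the full NQC):
# drop values beyond k standard deviations and repeat until none remain.
import numpy as np

def iterative_qc(values, k=3.0):
    vals = np.asarray(values, dtype=float)
    keep = np.ones(vals.size, dtype=bool)
    while True:
        mu, sigma = vals[keep].mean(), vals[keep].std()
        new_outliers = keep & (np.abs(vals - mu) > k * sigma)
        if not new_outliers.any():      # convergence: no new outliers found
            return keep
        keep &= ~new_outliers

data = np.r_[np.random.default_rng(2).normal(15.0, 0.5, 200), 25.0, 3.0]
print(int((~iterative_qc(data)).sum()), "observations rejected")
```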

SESSION 3: LOOKING FORWARD

3.1       Converging on multi-stakeholder best practices in data and information management (what are good practices in science and operations, and how to build bridges between these domains)

15:34-15:43
[71]Enhancing FAIR in situ data delivery: EuroGOOS recommendations for the Ocean Decade
Sylvie Pouliquen
, Dina Eparkhina, Simona Simoncelli, Julien Mader, Gisbert Breitbach, Thierry Carval, Arnfinn Morvik, Joaquín Tintoré, Patrick Gorringe, Antonio Novellino, Emma Reyes, Juan Gabriel Fernández, Marta de Alfonso, Johana Linders, Begoña Pérez Gómez, Benjamin Pfeil, Veselka Marinova, Virginie Racapé, Dick M.A. Schaap, Peter Thijsse, Susanne Tamm and Leonidas Perivoliotis

[abstract] EuroGOOS is the European component of GOOS, the Global Ocean Observing System of the Intergovernmental Oceanographic Commission of UNESCO. Since its establishment in 1994, EuroGOOS and its regional systems (five Regional Operational Oceanographic Systems in the Arctic, Baltic, North-West Shelf, Ireland-Biscay-Iberia and Mediterranean) have contributed widely to GOOS policies in sharing ocean data and in the co-production of oceanographic services. However, many gaps and barriers still exist and must be addressed in the UN Decade of Ocean Science for Sustainable Development 2021-2030. It is essential to improve the quantity, quality, accessibility, interoperability, and usability of marine information for decision making and to enable new opportunities in the maritime sectors, for the benefit of citizens. These drivers and goals set the requirements for improved ocean data management. To meet these needs, EuroGOOS established a dedicated working group on Data Management, Exchange, and Quality (DATAMEQ). DATAMEQ works to: (i) conceptually guide EuroGOOS in situ data management considering the existing systems; (ii) propose solutions to unlock data for operational and research purposes; and (iii) deliver standards for data matching the principles of Findability, Accessibility, Interoperability and Reusability (FAIR). Capitalizing on national and European projects and programmes, in situ data FAIRness has advanced significantly in some domains. This includes better harmonization, interoperability, and integration of data and metadata services at the regional or observing-network level. DATAMEQ has helped to improve common reference vocabularies, transfer knowledge and best practices among initiatives, and define recommendations for data providers. We will present the findings of this work and recommendations for ocean data stakeholders, users, and providers for the Decade to come, defining priorities in terms of FAIR data for operational oceanography services and ocean research responsive to European and global policies and societal challenges.


15:43-15:52
[72]Bringing together sediment quality data from regulatory, environmental assessment and monitoring sources to inform marine applications and provide a ‘one stop shop’ for stakeholders
Jemma Anne Lonsdale
, Claire Mason and Sylvia Blake

[abstract] The Centre for Environment, Fisheries and Aquaculture Science (Cefas), on behalf of the United Kingdom (England, Scotland, Wales, Northern Ireland) and Jersey, Guernsey, and the Isle of Man, collects and reports annually data related to obligations under the London Convention/London Protocol on the Prevention of Marine Pollution by Dumping of Wastes and Other Matter (LCLP) and under the Convention for the Protection of the Marine Environment of the North-East Atlantic (OSPAR). The respective regulators provide data on the permits that have been issued in a reporting year (January to December) as well as the amount of material and contaminants that have been disposed of at designated disposal sites within the same reporting year. The data are analysed to determine the contaminant loading placed at the respective disposal sites. The analysed data are submitted to the secretariats of the international treaties (OSPAR and LCLP) to determine progress towards the objectives for reducing pollution of the marine environment. To be able to do this, Cefas collates sediment quality data in relation to dredge and disposal applications, primarily for England and Wales, but also for Scotland, Northern Ireland, Jersey, Guernsey, and the Isle of Man. Detailed sediment quality data are invaluable in environmental assessments related to any project which may impact sediment quality. Under a project funded by the UK’s Department for Environment, Food and Rural Affairs (Defra), Cefas has developed an online app using the detailed contaminant sample data from England and Wales collected to support marine licence applications; the app allows users to access and interrogate the data against environmental (sediment quality) thresholds for their own assessments and provides the first step in a ‘one stop shop’ for marine licensing data. The application has been developed with stakeholder input and feedback.

15:52-15:58 Q&A
15:58-16:08 Health break

16:08 – 16:17
[73]Towards harmonization of monitoring methods and data sharing for ocean surface microplastics
Office of Policies against Marine Plastics Pollution, Ministry of the Environment, Japan
, Keiji Nakashima and Yutaka Michida

[abstract] Endorsed by the Ocean Decade, the Ministry of the Environment, Japan (MOEJ) is running a project on the development of a database of ocean surface microplastics. The project aims to form a global hub to share information and compile monitoring data on ocean surface microplastics in collaboration with existing and future related activities. Marine litter, including microplastics, is an urgent global issue, and understanding the existing distributions and quantities of microplastics in the ocean is a vital first step towards solving this problem. However, comparing reported microplastics abundances remains a challenge due to the diversity of monitoring methods, so harmonization of the monitoring methods for marine litter, including microplastics, is regarded as a top-priority task. To overcome this challenge, MOEJ has been promoting global density distribution mapping of ocean surface microplastics, thereby revealing the state of marine pollution. A key outcome of this activity was the development of guidelines on the harmonization of sampling and analytical methods for ocean surface layer microplastics, based on the results of two projects involving international experts, which compared various sampling methods in the ocean and analytical methods in laboratories (“Guidelines for Harmonizing Ocean Surface Microplastic Monitoring Methods”, http://www.env.go.jp/en/water/marine_litter/guidelines/guidelines.pdf). The next milestone, following the development of the Guidelines, is the launch of the Database. It will provide data intended for harmonization, i.e. data treated by quality control and levelling. Users of the Database can freely access and download the data and display them on a two-dimensional map to assess the impact of microplastic pollution, to develop simulation models, to plan countermeasures, and much more. The Database contains about 13,000 data records at the outset, and MOEJ is seeking partnerships and collaboration with existing and future practices to further accelerate this project.

16:17 – 16:26
[74]Digitising methodologies and catalysing best practice development and exchange: the status and future of the IOC Ocean Best Practices System in the ocean’s digital ecosystem
Pier Luigi Buttigieg
, Johannes Karstensen, Frank Muller-Karger, Jay Pearlman and Pauline Simpson

[abstract] Ocean-relevant digital assets are multiplying, diversifying, and playing an ever-more central role in understanding and sustainably managing the ocean. How these assets are generated, processed, and shared strongly influences their potential for (re)use. Thus, comprehensively documenting, digitising, and persistently archiving methodology is a core requirement when building data systems at any scale. To this end, the Intergovernmental Oceanographic Commission’s (IOC) Ocean Best Practices System (OBPS) – co-sponsored by IODE and GOOS – aims to facilitate the digitisation, long-term archiving, and machine-driven exchange of methodology across all ocean communities, accompanied by rich metadata relevant to prevailing global initiatives and goals. Further, it aims to support communities in managing the evolution and convergence of their methods, standards, and related content stored in any digital medium. The OBPS has built its digital capabilities – centred on regularised document and metadata stores alongside natural language processing and semantic tagging systems – upon input from community consultations, focus groups, thematic task teams, and collaborations with projects and a growing suite of UN Ocean Decade Actions. The OBPS multiplies the value of its holdings by participating in the Ocean Data and Information System (ODIS), projecting its holdings via the ODIS Architecture (ODIS-Arch) to platforms such as the IOC’s Ocean InfoHub. Many more opportunities are emerging (e.g. building interoperable methodology management systems and crowdsourcing), and the OBPS is developing new capacities to support both foundational and highly advanced initiatives. In addition to new technology, these capacities also require socio-culturally tuned user experiences, an appreciation of intersecting value systems, intellectual property management, and mechanisms to address the legal complexities of global data sharing. This presentation will summarise the current digital capacities of the OBPS, but focus on how we plan to expand these to meet the needs expressed by our growing community of users and co-developers.

3.2       Expanding the pool: new partnerships (private sector, other digital stakeholder groups)

16:26 – 16:35
[75]Partnering with Stakeholders: The Moana Project’s Te Tiro Moana and Mangōpare Sensor Programme
Julie Jakoboski
, John Radford, João Marcos Azevedo Correia de Souza, Malene Felsing and Moninya Roughan

[abstract] The Moana Project is an interdisciplinary, New Zealand-wide oceanographic research programme focused on improving our understanding of New Zealand’s oceans in order to support a Blue Economy. The Te Tiro Moana work stream (“Eyes on the Ocean”) focuses on identifying, obtaining, and liberating physical oceanographic observations through collaboration with stakeholders across New Zealand, representing a wide range of organisations and individuals. We have collaborated with, and obtained existing oceanographic observations from, a diverse group of iwi, government, academic, industry, commercial, and private organisations with connections to the ocean. After processing, we provide the datasets to the publicly accessible New Zealand Ocean Data Network (www.nzodn.nz) where possible, and/or via API. To improve the Moana Project’s ocean forecasts and hindcasts, better observational coverage of the coastal regions is necessary. A low-cost temperature and depth sensor, called Mangōpare, was developed by Zebra-Tech Ltd and is currently installed on commercial fishing vessels across New Zealand, with research and educational vessels also involved in the programme. The sensor, sensor offload, and data pathway are fully automatic, from sensor to hydrodynamic ocean model. This data pathway includes data transmission, processing, quality control, database ingestion, assimilation into the Moana Project modelling suite, and the provision of processed measurements back to the vessel that obtained them. Ocean forecasts and hindcasts are made available via multiple platforms, tailored to the feedback of stakeholders and sensor programme participants. This allows stakeholders beyond the scientific research community to utilise, understand, and inform decisions using the oceanographic modelling suite, which is directly improved via the assimilation of observations from a diverse range of ocean stakeholders. Programme success depends on stakeholder (commercial fisher, government, iwi, academic, industry, private, research, non-profit, community) involvement in obtaining oceanographic observations and utilising the resulting products, providing a critical connection between sectors in ocean research, sustainability, and policy.

16:35 – 16:44
[76]Implementation of the Salvamares Program and the creation of the PESCADATA-SNP of the Peruvian marine ecosystem.
Salvador Peraltilla
, Erika Meneses Yance, Anibal Aliaga Rosales and Jorge Risi Mussio

[abstract] The National Fisheries Society (SNP) of Peru, committed to and aware of environmental problems (the high environmental variability of the Peruvian marine ecosystem that alters the behavior of marine resources; the effects of climate change, such as greenhouse gases and marine pollution; the incidental and accidental capture of marine fauna that interacts with fishing; and the scarcity of available, up-to-date scientific information on the Peruvian marine ecosystem), created the Salvamares program in 2017. This program aims to contribute to the sustainability of the marine ecosystem by monitoring and releasing the main species that interact with the fishing industry, thus helping to secure and protect the ecosystem, to give ongoing alerts on the effects of climate change, and to develop a large database on the Peruvian marine ecosystem. The information system, or database, assembled by the Salvamares program is called PESCADATA-SNP; it consists of data collected using fishing boats during fishing seasons, in fishing prospecting carried out in support of Peruvian Marine Research Institute (IMARPE) activities, and in Eureka operations, among others. Information management, involving the collection, processing and analysis of acoustic, biological and oceanographic information, is in the charge of the specialists who make up the Scientific Research Committee of the SNP. Since 2017, PESCADATA-SNP has integrated information on the composition of catches and the biometrics of target and accompanying species, together with oceanographic information such as sea temperature in each fishing operation and on the Oxygen Minimum Zone obtained from acoustic data. The results of the program are measured first by the success of the implementation of the Salvamares program itself, and then by the generation of PESCADATA-SNP, contributing to fisheries improvements and management.

16:44-16:53
[77]The Private Sector: A Key Data Partner in Implementing the Ocean Decade
David Millar

[abstract] While the Ocean Decade will not set ocean policy, it will build scientific capacity and generate data that will contribute to the 2030 Agenda for Sustainable Development and other relevant global legal and policy frameworks. The integration of science and policy in a mutually reinforcing manner will be critical to balancing sustainable development and environmental stewardship of the ocean and its resources. Unfortunately, the availability of ocean science data is currently insufficient to inform sustainable ocean governance and policies. This presents an opportunity for the private sector, which is actively involved in the sustainable development of the world’s oceans. The private sector is already collecting ocean science data in support of resource and infrastructure development projects around the globe. These Geo-data support feasibility studies, siting, permitting, design, engineering, construction, operations, and maintenance activities, but are generally not shared or made publicly accessible. As a result, there is a wonderful opportunity to leverage ongoing private sector activities to improve our collective understanding of the world’s oceans. Now more than ever, companies understand that employees, shareholders, customers, and society expect them to contribute to a sustainable future. Businesses must balance short- and long-term stakeholder interests while integrating economic, social, and environmental considerations into decision-making. Given the private sector’s increasing awareness of and focus on ocean stewardship, it represents an important partner in expanding the pool of ocean science data generators and users. This presentation will discuss, with real-world examples, the role of the private sector in contributing to a digital ecosystem that supports the Ocean Decade, focusing on how public-private partnerships and equitable frameworks can provide public access to private Geo-data. It will also introduce the vision for a new private sector Geo-data working group that will help accomplish these goals in support of the Ocean Decade.


16:53-17:02
[78]EMODnet Chemistry: harmonising and consolidating in Europe for a global engagement
Alessandra Giorgetti
, Chiara Altobelli and Dick M.A. Schaap

[abstract] EMODnet Chemistry collects, validates, and provides open and free access to over 1 million data sets on eutrophication and marine pollution. Together, 46 national oceanographic data centres, environmental monitoring agencies, and expert institutes are active in its European network, gathering and processing data from 500+ organisations in 32 countries into validated data collections and derived maps. Some significant achievements:
• Assigned to gather data from all EU Member States to monitor marine litter occurrence (beach and sea floor); this also served for computing the EU beach litter quantitative baselines and threshold value. The marine litter scope has been expanded with floating micro-litter monitoring data;

• Adopted and upgraded the SeaDataNet CDI data discovery and access service to become fit for chemistry purposes and requirements from MSFD experts;
• Initiated together with Copernicus Marine Environment Monitoring Service (CMEMS) the first release of a joint portfolio of MSFD products; EMODnet Chemistry data collections are contributing to the CMEMS REP-BGC data product.

EMODnet Chemistry is also increasingly invited beyond European boundaries to contribute to the UN 2030 Agenda for Sustainable Development. It was engaged by the NGO Sulitest, in partnership with UN DESA and MOI, to develop the Sulitest SDG 14 module for ocean literacy, and by UNEP/MAP to contribute to the Ecosystem Approach Roadmap, given its potential support to the delivery of the 2023 UNEP/MAP Mediterranean Quality Status Report. Other international activities include participation in the peer review process of the global Marine Litter and Plastic Pollution Ontology and in the implementation of the Global Ocean Oxygen Database and ATlas (GO2DAT). Furthermore, EMODnet Chemistry contributes to the development of the Global Ocean Acidification (SDG 14.3.1) Data Portal, participates in the G20 Technical Working Group on the harmonisation and standardisation of monitoring and data management methods for marine plastics, and takes part in the UN Global Partnership on Marine Litter Digital Platform. Over the past ten years, EMODnet Chemistry has become the European marine chemistry data broker.
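As a purely illustrative aside to the beach litter baseline work listed above, such baselines are commonly derived as a robust statistic (for example, the median) of litter counts normalised to a fixed survey length. The minimal sketch below uses invented numbers and the median per 100 m; it is not the official EU baseline methodology or data.

    # Illustrative only: a beach-litter baseline as the median of survey
    # counts normalised to items per 100 m. Data are invented.
    from statistics import median

    # (items counted, surveyed beach length in metres) for each survey
    surveys = [(142, 100), (87, 50), (260, 200), (45, 100), (310, 100)]

    items_per_100m = [count * 100 / length for count, length in surveys]
    baseline = median(items_per_100m)
    print(f"Baseline: {baseline:.0f} items per 100 m")  # Baseline: 142 items per 100 m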

3.3       Future-proofing of our digital commons towards AI and model-ready data

17:02-17:11
[79]AI-Driven Indonesia’s National Ocean Data Center
Hammam Riza
, Andi Eka Sakya, Marina Frederik and Imam Mudita

[abstract] On 5 December 2017, the UN General Assembly proclaimed the Decade of Ocean Science for Sustainable Development (2021-2030). Its development and priority areas include, among others, a Data and Information System, the Ocean Dimension in an Integrated Multi-hazard Warning System, and the Ocean in Earth System Observation, Research and Prediction, all aimed at delivering the Decade’s seven societal outcomes, which underline the qualities of the ocean and of the people who depend on it. These seven outcomes are a clean ocean, a healthy and resilient ocean, a productive ocean, a predicted ocean, a safe ocean, an accessible ocean, and an inspiring and engaging ocean. As a maritime continent, Indonesia exhibits unique weather and climate variations owing to its geographical position, largely covered by ocean and marked by deep convection processes. Multi-scale interactions in the atmosphere and ocean have a profound impact on air-sea thermal exchanges, modulating climate variability over a wide range of time scales. Geographically, Indonesia is also squeezed between three major tectonic plates. The recent tsunamis that swept over Palu and the Sunda Strait add an atypical source of tsunami risk to the hazards intrinsic to Indonesia’s position on the Ring of Fire. Indonesia is thus highly prone to multi-hazard events, both geological and hydro-meteorological. Unavoidable climate change with its negative impacts, together with the potential occurrence of tsunamis, has created a basic need for ocean data and information. A better understanding of past variations and trends will underpin better lives for ocean communities, as envisioned in the Decade’s Safe Ocean outcome. To support risk assessment mapping and analysis, not only with coastal topography and bathymetry for coastal hazards but also with bio-physico-chemical parameters for establishing an ocean health index, a National Ocean Data Center (NODC) has been established. The Indonesian NODC (NODC.id) integrates marine environment and ecosystem data and information from all Indonesian institutions. The system was officially launched during the Our Ocean Talk in Bali in 2018 and has been open to the public since then. This paper outlines the establishment of the Indonesian National Ocean Data Center (Ina-NODC), whose development builds on recent advances in artificial intelligence. Indonesia’s AI platform supports the deployment and implementation of smart ocean applications. The platform’s big data and AI service capabilities will advance the digitalization of ocean management and facilitate bilateral cooperation in marine science and technology. Indonesia’s national smart ocean AI platform will strengthen exchanges between universities and research institutions and align projects for the integration of industry and education. It will also help cultivate AI talent for the Fourth Industrial Revolution and promote the sharing of AI talent and scientific research findings.
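The abstract does not detail the platform’s AI services; purely as an illustration of the kind of basic automated quality-control a national ocean data platform might host, the sketch below flags anomalous sea-temperature readings with a rolling z-score. The data, window and threshold are invented; this is not a description of the actual Ina-NODC implementation.

    # Illustrative only: flag readings that deviate strongly from the
    # recent past, a simple automated QC step. Data are invented.
    import statistics

    def flag_anomalies(series, window=5, threshold=3.0):
        """Return indices whose value deviates more than `threshold`
        standard deviations from the mean of the previous `window` values."""
        flags = []
        for i in range(window, len(series)):
            past = series[i - window:i]
            mean = statistics.mean(past)
            std = statistics.stdev(past)
            if std > 0 and abs(series[i] - mean) / std > threshold:
                flags.append(i)
        return flags

    sst = [28.1, 28.0, 28.2, 28.1, 28.3, 28.2, 31.9, 28.2, 28.1]  # deg C
    print(flag_anomalies(sst))  # [6] -> the 31.9 reading is flagged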

3.4       How to include/involve ECOPs and students in data and information management

17:11-17:20
[80]Contribution of Sandwatch to ocean data collection and sharing during the Ocean Decade
Sachooda Ragoonaden

[abstract] Sandwatch is a UNESCO action-oriented programme which, since 2001, has through capacity building enabled schools and coastal communities around the world to collect ocean data and information to enhance the resilience of beach environments to climate change impacts and to promote ocean literacy. A key component is an international database for archiving the beach data collected. However, the “Flash” application that operated it was retired on 31 December 2020. A new client software application is being considered to reconfigure the system so that the existing data can be retrieved and newly collected data archived.
The objective is to demonstrate the potential of Sandwatch, as a citizen-science tool, to generate data and information useful to a wide range of users, to contribute to coastal management, and to promote the integration of Sandwatch into the IODE network.
Sandwatch will take advantage of the UN Decade of Ocean Science for Sustainable Development (2021-2030) to further enhance its activities, enabling the generation of good-quality operational and research data and the development of an appropriate database that facilitates transparent and accessible data sharing and stewardship. These data, from schools, local communities and citizens who experience the changes at a local level, will constitute an important source complementing scientific data.
Integrating Sandwatch into IODE will facilitate the exchange of data between sandwatchers and the IODE community, to the benefit of all parties. Sandwatch looks forward to participating in the Ocean Data and Information System (ODIS) that IODE intends to develop.
Sandwatch is very keen to be part of the IODE network and to contribute to the sharing of ocean data for the benefit of the ocean community and the welfare of the people.
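The abstract notes that a replacement for the retired Flash-based database is under consideration; purely as an illustration of how a simple, transparent archive format for a Sandwatch-style beach observation might look, here is a minimal sketch that writes one observation as JSON. All fields and values are invented, not the actual Sandwatch schema.

    # Illustrative only: archiving a hypothetical beach observation as
    # JSON. Fields are assumptions, not the actual Sandwatch database.
    import json
    from datetime import date

    observation = {
        "school": "Example Secondary School",
        "beach": "Example Beach",
        "date": date.today().isoformat(),
        "beach_width_m": 23.5,            # distance from reference marker
        "wave_height_m": 0.8,
        "litter_items_per_100m": 37,
        "notes": "Erosion scarp visible after recent storm",
    }

    with open("sandwatch_observation.json", "w", encoding="utf-8") as f:
        json.dump(observation, f, indent=2)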

17:20-17:26 Q&A
17:30 Adjourn

DAY 3: 16 February 2022

SESSION 4: DELIVERING A TRANSFORMATIVE DATA ECO-SYSTEM FOR THE OCEAN DECADE

4.1 The Ocean Decade Data Vision: data as an integral part of the Ocean Decade Challenges (Presentations and Panel discussion)

09:00-09:30: Setting the scene

  • The Vision of data, information and knowledge management for the Decade – The Ocean Decade Implementation Plan & Data (Julian Barbière, IOC) [remote] – presentation
  • Initial reflections on the envisaged Ocean Data Eco-system (Terry McConnell, IOC) [on-site] – presentation
  • Reflections on relevant points from presentations and discussions in Sessions 1, 2 and 3 (Ute Brönner, Fraunhofer Institute for Computer Graphics Research IGD) [remote]

09:30-10:15: Panel discussion with audience: Data as an integral part of the Ocean Decade Challenges

Moderator: Dick Schaap (MARIS, SeaDataNet) – presentation

Panelists: Bjorn A Saetren (FAIRDO), Rishi Sharma (FAO), Ward Appeltans (OBIS), Emma Heslop (GOOS)

  • End-to-end needs for data to address key Decade Challenges  
  • Barriers, challenges and opportunities to achieve the vision towards a global digital ecosystem contributing to addressing the Decade challenges
  • FAIR Data in the Ocean Decade and guiding principles for implementing the vision and measuring success
  • Key components and transformation requirements for the data ecosystem we need for the ocean we want


10:15-10:30 Health break

4.2 Coordinating the co-implementation of the Ocean Decade’s Digital Vision (Presentations and Panel discussions)

10:30-11:00: Introductory presentations

  • The Ocean Decade Coordinating Structure (Terry McConnell, IOC) [on-site] – presentation
  • The Data Coordination Platform for the Decade – a mechanism to achieve the data vision (Kate Wing, Data Coordination Group (DCG) Co-Chair, Intertidal Agency & Jan-Bart Calewaert, DCG Co-Chair, EMODnet) [on-site] – presentation
  • IODE-IWG Strategy for Ocean Data and Information Stewardship (SODIS) for the UN Ocean Decade and other key global data initiatives contributing to the Decade data vision and implementation (Pier Luigi Buttigieg, GEOMAR) [remote] – presentation

11:00-11:45 – Panel discussion with audience on the co-implementation of the Ocean Decade’s Digital Vision and current state of play regarding data in the Ocean Decade

Moderator: Jan-Bart Calewaert (EMODnet)

Panelists: Pier Luigi Buttigieg (GEOMAR, ODIS), John Siddorn (NOC, DITTO), Nadia Pinardi (University of Bologna, COASTPREDICT), Gry Ulverud (C4IR)

  • Perspectives from key Decade Programmes (DITTO, IODE-ODIS, COASTPREDICT) and reflections from the private sector on the Ocean Decade Data Vision, their role in achieving this vision, and the current state of play with regard to data in the Ocean Decade
  • Coordination with other global data initiatives and partners
  • Ensuring that all communities and stakeholders are consulted, engaged and involved in the strategy development and implementation  
  • Feedback on the Ocean Decade Data Vision and recommendations for the co-design/implementation process from the audience

11:45-12:00 Concluding remarks and wrap-up – Next steps for the Ocean Decade Data Coordination Platform and opportunities for ocean (data) community involvement (Terry McConnell, IOC)

SESSION 5: CONFERENCE DECLARATION AND CLOSING

12:00-13:00

  • Summary Session 1 – Adam Leadbetter
  • Summary Session 2 – Simona Simoncelli
  • Summary Session 3 – Juliet Hermes
  • Conference Declaration
  • Closing

13:00-…: Lunch