Acoustic data processing and ecological analysis services
We provide specialist processing and analysis of acoustic datasets, alongside broader ecological and environmental data analysis where required. Services focus on turning raw data into clear, usable outputs – from feature extraction and interpretation through to modelling, reporting, and system development. Work is tailored to the dataset, survey design, and project objectives, with an emphasis on practical delivery and reliable results.
Projects typically begin with a short review of the dataset and objectives. From there, a clear processing approach is defined, followed by analysis, delivery of outputs, and optional iteration or system development. Work can range from one-off datasets to ongoing support for monitoring programs or survey workflows.
NEKTOFORM — acoustic feature extraction service
NEKTOFORM is a web-based analysis service designed for rapid extraction and review of acoustic features such as fish schools, krill swarms, and scattering layers. It enables fast, interactive processing of survey-scale echosounder data, handling real-world noise and data quality issues while supporting transparent review and downstream analysis. The platform provides a consistent workflow for extracting, inspecting, and refining features, allowing users to explore parameter space, assess outputs, and generate analysis-ready datasets within a single environment.
Best used for:
- rapid feature extraction from large datasets
- exploratory analysis and validation
- preparing inputs for ecological or modelling workflows
Outputs include:
- extracted features (schools, swarms, layers)
- feature-level metrics and summaries
- quality indicators and filtering diagnostics
Acoustic data processing
Processing of single-beam, multibeam, imaging sonar, and broadband echosounder datasets into structured, analysis-ready outputs. Work includes noise removal and data cleaning, gridding and standardisation, feature extraction, and calculation of summary statistics and biomass estimates. Processing approaches are adapted to the dataset and survey conditions to ensure robust and interpretable results. This also includes single-target detection, seabed analysis, and integration with environmental or survey metadata where required.
Typical outputs:
- cleaned and processed datasets
- biomass estimates and summary statistics
- spatial distributions and gridded products
- feature-level outputs (schools, swarms, layers)
Applications:
- fisheries surveys and stock assessment
- seabed mapping and habitat analysis
- environmental monitoring
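As a rough illustration of the kind of cleaning and standardisation step this involves, the sketch below masks low-amplitude samples and averages volume backscattering strength (Sv) into coarser depth bins. It is a minimal example only, assuming Sv data already gridded as a NumPy array of dB values (depth × ping); the function name and parameter defaults are hypothetical, not part of any specific toolchain.

```python
import numpy as np

def clean_and_bin_sv(sv_db, noise_floor_db=-90.0, rows_per_bin=10):
    """Mask samples below a noise floor, then average Sv into
    coarser depth bins (sv_db: 2D dB array, depth x ping).

    Averaging is done in the linear domain, since dB values
    must not be averaged directly.
    """
    sv = np.asarray(sv_db, dtype=float)
    # Treat low-amplitude samples as noise and exclude them (NaN).
    masked = np.where(sv < noise_floor_db, np.nan, sv)
    # Pad the depth axis so it divides evenly into bins.
    pad = (-masked.shape[0]) % rows_per_bin
    if pad:
        masked = np.vstack([masked, np.full((pad, masked.shape[1]), np.nan)])
    # dB -> linear, average within each depth bin, back to dB.
    linear = 10.0 ** (masked / 10.0)
    binned = np.nanmean(
        linear.reshape(-1, rows_per_bin, linear.shape[1]), axis=1
    )
    return 10.0 * np.log10(binned)
```

In practice the noise floor, bin sizes, and masking rules are tuned to the instrument, frequency, and survey conditions rather than fixed as above.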
Feature detection and ecological interpretation
Extraction and analysis of biological features such as fish schools, krill swarms, and scattering layers, with a focus on their structure, behaviour, and environmental context. This includes characterising feature morphology, scattering properties, vertical distribution, and temporal dynamics, and linking these observations to environmental drivers and survey conditions. Outputs are used to support ecological interpretation, stock assessment, and analysis of spatial and temporal patterns across marine and freshwater systems.
Typical outputs:
- feature-level metrics and classifications
- behavioural and distributional summaries
- environmental associations and drivers
- inputs to ecological and stock assessment models
Applications:
- identifying structure and organisation of biological aggregations
- comparing feature characteristics across surveys, sites, or seasons
- understanding diel behaviour and vertical migration patterns
- supporting interpretation of ecosystem dynamics and trophic interactions
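To give a concrete sense of feature-level metrics, here is a minimal sketch that summarises one detected aggregation (e.g. a fish school) from a gridded Sv echogram and a binary detection mask. The metric set, function name, and grid-resolution parameters are illustrative assumptions, not a fixed specification.

```python
import numpy as np

def school_metrics(sv_db, mask, sample_height_m=0.2, ping_distance_m=1.0):
    """Morphology and scattering summaries for one detected feature.

    sv_db: 2D Sv echogram in dB (depth x ping).
    mask:  boolean array of the same shape marking the feature.
    """
    depth_idx, ping_idx = np.nonzero(mask)
    # Mean Sv must be computed in the linear domain, then converted back.
    linear = 10.0 ** (sv_db[depth_idx, ping_idx] / 10.0)
    return {
        "area_m2": depth_idx.size * sample_height_m * ping_distance_m,
        "height_m": (depth_idx.max() - depth_idx.min() + 1) * sample_height_m,
        "length_m": (ping_idx.max() - ping_idx.min() + 1) * ping_distance_m,
        "mean_sv_db": 10.0 * np.log10(linear.mean()),
        "centroid_depth_m": depth_idx.mean() * sample_height_m,
    }
```

Metrics like these feed directly into classification, behavioural summaries, and comparisons across surveys, sites, or seasons.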
Scalable workflows and systems
Development of end-to-end processing pipelines that take acoustic data from raw input through to model outputs, reporting, and decision-ready products. This includes automated workflows, efficient handling of large datasets, and integration with external data sources. Systems can be delivered as interactive dashboards, analysis tools, or secure web services (including Azure-based deployments). The focus is on reproducible, efficient workflows that support ongoing monitoring, large-scale analysis, and consistent data handling.
Typical outputs:
- automated processing pipelines
- dashboards and reporting tools
- model-ready datasets and outputs
- integrated data systems linking multiple sources
Applications:
- standardising processing across multiple surveys or data sources
- reducing manual effort in large or repeated analysis workflows
- enabling consistent reporting across sites, years, or programs
- supporting operational monitoring and decision-making systems
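The pipeline pattern behind these systems can be sketched very simply: named processing stages applied in order, with a record of which stages ran so any output can be traced back to its workflow. The stage names and lambdas below are placeholders standing in for real processing steps, not an actual deployment.

```python
import numpy as np

def run_pipeline(data, steps):
    """Apply (name, function) processing steps in order, keeping a
    history of stage names for reproducibility and review."""
    history = []
    for name, step in steps:
        data = step(data)
        history.append(name)
    return data, history

# Hypothetical stages standing in for real processing steps.
steps = [
    ("mask_noise", lambda sv: np.where(sv < -90.0, np.nan, sv)),
    ("to_linear", lambda sv: 10.0 ** (sv / 10.0)),
    ("ping_mean", lambda lin: np.nanmean(lin, axis=0)),
]
result, history = run_pipeline(np.full((5, 4), -70.0), steps)
```

Production systems wrap the same idea with configuration, logging, and large-dataset handling, but keeping each stage as a small, named function is what makes workflows repeatable across surveys and easy to audit.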
Have data that needs processing or analysis?
Send a short description of your dataset or project and I’ll get back to you quickly. I can review your data, advise on approach, and outline a clear path to usable outputs.