PhD opportunity - 'Multi-source remote sensing for enhanced flood modelling' jointly funded by Newcastle University and James Hutton Institute.
Deadline: 5th January 2018
Flooding is a major societal challenge with significant direct and indirect impacts. Hydrodynamic models are important for accurately modelling floods and for understanding the adaptations required to improve resilience. These models require topographic data defining the channel and floodplain, currently assembled from relatively sparse cross-section measurements and walk-over surveys.

Emerging remote sensing techniques are of increasing relevance here, offering a non-contact means of deriving detailed topography and other key variables related to hydromorphological characterisation (e.g. pool-riffle sequences, gravel bars, riparian vegetation). Unmanned aerial vehicles (UAVs, or drones), in combination with compact digital cameras, can deliver high-resolution digital elevation models (DEMs) and orthoimagery, providing a flexible, low-cost approach to reach-scale characterisation. Furthermore, recent developments in airborne laser scanning (lidar) enable remote measurement of river bathymetry and water depth, with huge potential for seamless mapping of fluvial topography.

However, significant challenges remain in the intelligent extraction of the relevant variables, requiring the development of enhanced segmentation algorithms and the adoption of big-data analytics. This project will collect UAV imagery at an existing test site and integrate it with bathymetric lidar for reach-scale characterisation of key variables for flood modelling, leading to the following objectives:
1. Assemble multi-source remote sensing datasets of channel and floodplain topography;
2. Develop novel, big data approaches for intelligent and automated extraction of key variables;
3. Integrate derived variables into an existing hydrodynamic model;
4. Validate the approach through application to a monitored test site environment.
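To give a flavour of the kind of automated variable extraction envisaged in objective 2, the sketch below derives a water-depth raster and a wetted-channel mask from a bed-elevation grid. This is a minimal illustration only: the synthetic parabolic valley, the water surface elevation, and the depth threshold are all assumptions for demonstration, not part of the project specification, and a real workflow would operate on UAV- or lidar-derived rasters.

```python
import numpy as np

def extract_depth_map(dem, water_surface_elevation):
    """Water depth = water surface elevation minus bed elevation, clipped at zero."""
    return np.clip(water_surface_elevation - dem, 0.0, None)

def segment_channel(depth, min_depth=0.05):
    """Binary channel mask: cells wetter than a small depth threshold (metres)."""
    return depth > min_depth

# Synthetic stand-in for a UAV/lidar-derived DEM (illustrative only):
# a parabolic valley cross-section, repeated along 50 rows of the grid.
x = np.linspace(-1.0, 1.0, 100)
dem = np.tile(5.0 * x**2, (50, 1))          # bed elevation in metres

depth = extract_depth_map(dem, water_surface_elevation=1.0)
mask = segment_channel(depth)

print(f"Wetted fraction: {mask.mean():.2f}, max depth: {depth.max():.2f} m")
```

In practice the same clip-and-threshold pattern scales to large multi-source rasters, and the simple threshold would be replaced by the enhanced segmentation algorithms the project aims to develop.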