Project Description
This project focuses on deep learning algorithms to model human data, such as images/video, 3D skeletal motion and 3D body/facial surfaces, for recognition, prediction and reconstruction.
The Project: With recent advancements in deep learning and machine learning frameworks based on artificial neural networks, computers have become smarter than ever at understanding images, video and 3D data. In this project, you will research state-of-the-art deep learning algorithms to model human data, which is core to much computer vision research. Such data includes human images/video, 3D skeletal motion, 3D body/facial surfaces, etc. Effective modelling of the data enables applications such as action/activity recognition, motion prediction, behaviour/emotion analysis and 2D-to-3D reconstruction. The detailed project direction will be developed during the initial stage of your PhD study according to your experience and strengths.
Related past research from our team includes:
- High-Speed Multi-Person Pose Estimation with Deep Feature Transfer, Computer Vision and Image Understanding http://hubertshum.com/pbl_cviu2020pose.htm
- A Quadruple Diffusion Convolutional Recurrent Network for Human Motion Prediction, IEEE Transactions on Circuits and Systems for Video Technology http://hubertshum.com/pbl_tcsvt2021prediction.htm
- A Unified Deep Metric Representation for Mesh Saliency Detection and Non-rigid Shape Matching, IEEE Transactions on Multimedia http://hubertshum.com/pbl_tmm2020mesh.htm
Supervision: As a PhD student, you will be supervised by Dr Hubert Shum (http://hubertshum.com/), an Associate Professor in Computer Science at Durham University. He has published over 100 research papers in the fields of computer vision, computer graphics, motion analysis and machine learning. He has led funded research projects awarded by the UK Research Council, the Ministry of Defence and the Royal Society, which have enabled him to supervise 23 PhD students and 6 postdoctoral researchers. Engaging with both academia and industry, he has hosted international conferences such as BMVC and the ACM SIGGRAPH Conference on MIG, and has served as an Associate Editor of CGF and a Guest Editor of IJCV.
During the PhD study, you will receive comprehensive training and research coaching through regular one-to-one meetings with Dr Shum. This interactive and tailored support will develop your strengths and consolidate your research knowledge. Furthermore, Dr Shum’s research team has a supportive culture, with members from all over the world who offer each other assistance and collaboration opportunities. This environment has helped his past PhD students publish their research in prestigious journals (e.g. IEEE TIP, IEEE TVCG and IEEE TMM) and develop successful careers.
Funding Information
This is a self-funded PhD position.
Eligibility Requirements
- A research interest in computer vision and machine learning
- Knowledge of any modern programming language
- A relevant undergraduate or master's degree with good grades
- Good English language skills (see https://www.dur.ac.uk/learningandteaching.handbook/1/3/3/1/)
Application Process
You are encouraged to send an email with a resume to Dr Hubert Shum ([email protected]) for an initial discussion. More information is available at http://hubertshum.com/
Formal applications should be submitted online.
Supplementary Information
The position is based at Durham University, which is ranked 4th in the UK by the Guardian and in the top 100 in the world by QS Top Universities. As a member of the elite Russell Group, Durham University focuses on research excellence delivered by world-class academics. It is located in Durham in North East England, one of the safest cities in the UK with an affordable cost of living. The Department of Computer Science is one of the fastest-growing departments in the University, supported by major investments in staff recruitment and a £40m new academic building. 83% of its research outputs are rated “world-leading” or “internationally excellent” by the UK Research Excellence Framework.