
  (2025-OU14) Developing new method of botanical survey using haptic technology


   School of Environment, Earth & Ecosystem Sciences

Dr Irina Tatarenko | No more applications being accepted | Funded PhD Project (Students Worldwide)

About the Project

Botanical surveys provide key information for many environmental habitat assessments. The standard method of measuring plant species abundance in communities was developed about a hundred years ago, alongside the establishment of plant ecology as a science (e.g., Warming, 1909). Species abundance is assessed by visually estimating the percentage of a defined-size quadrat covered by the vertical projection of the above-ground parts of each species. Visual estimates are highly subjective, depending on the experience and personality of the surveyors and on their plant identification skills (Kennedy & Addison, 1987; Morrison, 2016). This criticism has prompted the development of instrumental methods for cover measurement (e.g., Wilson, 2011); nevertheless, visual estimation remains widely used because of its flexibility and speed. Photographic techniques (e.g., Salas-Aguilar et al., 2017) are difficult to apply to complex, species-rich vegetation types such as the floodplain meadows that will be the focus of this project.

Plant species identification is based on visual morphological characters. Tactile properties also differ between species, yet they have never been used for identification purposes because their descriptions are ambiguous. Haptic technology allows the surfaces of plant leaves and stems to be assessed tactually at very fine resolution, which could improve the accuracy of species identification in real time. Once recorded, plant textures can be stored within a larger library of textures to serve as future identification aids. Recorded tactual plant textures could also be used to assess plant health, which could inform future teaching and learning activities.

Training AI to identify plant surfaces could be a vital element in linking plant identification data and delineating pertinent biomarkers of plant health and growth. The limits of 3D scanning have never been tested for the assessment of multi-layer, multi-species meadow vegetation, which is a challenging task. The project aims to develop a haptic tool that supports species identification both by assessing plant surfaces and by 3D-scanning the vegetation in depth.

Project highlights

  • The project challenges the classical method of botanical surveys, which has relied on human visual estimates for a hundred years. 
  • Haptic technology is a rapidly developing area that brings computing power and artificial intelligence into domains where tactile sensing is key. 
  • 3D/4D XR capture technology will be paired with haptic tactual exploration to build a comprehensive library of botanical textures for use in education and future research. 
  • Training AI to identify plant leaf and stem surfaces, and developing a haptic sensor able to assess those surfaces, are the first two steps towards a novel haptic tool for surveying complex, species-rich vegetation. 

Methodology

Classic botanical methodology will be applied to identify the plant species. Visual estimation of the abundance of different species in plant communities will be carried out by standard methods on 1 x 1 m plots in species-rich floodplain meadow communities. 
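
As an illustration of how such plot records are typically summarised, the sketch below computes mean percentage cover and frequency per species across quadrats. The species names and cover values are invented for the example and do not come from this project.

  # Illustrative only: hypothetical visual cover estimates (%) from three 1 x 1 m quadrats.
  quadrats = [
      {"Alopecurus pratensis": 35, "Sanguisorba officinalis": 20, "Ranunculus acris": 5},
      {"Alopecurus pratensis": 40, "Filipendula ulmaria": 15, "Ranunculus acris": 10},
      {"Sanguisorba officinalis": 25, "Alopecurus pratensis": 30},
  ]

  species = sorted({name for plot in quadrats for name in plot})
  for name in species:
      covers = [plot.get(name, 0) for plot in quadrats]
      mean_cover = sum(covers) / len(quadrats)                 # mean % cover across plots
      frequency = sum(c > 0 for c in covers) / len(quadrats)   # proportion of plots occupied
      print(f"{name}: mean cover {mean_cover:.1f}%, frequency {frequency:.0%}")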

After the 3D capture, tactual feature extraction will be applied: haptic technology will be used to traverse the 3D plant models and extract critical tactual information, which will then serve as tactual data. Once individual tactual data have been captured and recorded, haptic tooling can be used to identify the specific textures of rare botanical structures and of plants that are visually identical apart from their textural differences. AI and haptics could be combined to speed up tactual identification and to ease the use of the larger tactile library. AI training will be based on high-resolution scans of herbarium specimens as well as on living leaves and stems of meadow species. 
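
A minimal sketch of the kind of surface-texture classification that AI training could involve is given below. It assumes 8-bit greyscale patches of leaf or stem scans and invented species labels; the project's actual haptic and 3D pipeline is not specified in this advert, so grey-level co-occurrence (GLCM) statistics and a random-forest classifier stand in purely as an illustration.

  import numpy as np
  from skimage.feature import graycomatrix, graycoprops
  from sklearn.ensemble import RandomForestClassifier

  def texture_features(patch):
      """Grey-level co-occurrence (GLCM) statistics for an 8-bit greyscale surface patch."""
      glcm = graycomatrix(patch, distances=[1, 3], angles=[0, np.pi / 2],
                          levels=256, symmetric=True, normed=True)
      props = ["contrast", "homogeneity", "energy", "correlation"]
      return np.hstack([graycoprops(glcm, p).ravel() for p in props])

  # Hypothetical training data: one greyscale patch per specimen plus its species label.
  # Random patches are placeholders; real data would be high-resolution surface scans.
  patches = [np.random.randint(0, 256, (64, 64), dtype=np.uint8) for _ in range(20)]
  labels = ["Sanguisorba officinalis", "Filipendula ulmaria"] * 10

  X = np.array([texture_features(p) for p in patches])
  clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)

  # Predict the species of a new surface patch from its texture signature.
  new_patch = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
  print(clf.predict([texture_features(new_patch)])[0])

In practice the feature extraction would draw on the tactual data recorded by the haptic tooling rather than on image statistics alone, but the training loop would follow the same pattern: a labelled library of textures and a classifier queried in real time during a survey.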

Training and skills

DRs will be awarded CENTA Training Credits (CTCs) for participation in CENTA-provided and ‘free choice’ external training. One CTC can be earned per 3 hours of training, and DRs must accrue 100 CTCs across the three and a half years of their PhD.  

Full training in botanical survey and species identification will be provided, developing an understanding of plant morphology: 1) introduction to plant morphology, 2) use of ID keys, 3) building up a portfolio of plant ID skills, and 4) practising plant ID.   

Haptic technology: full training will form part of the students’ initial training. Students will be trained in the following: 1) human haptics, covering the physics of force and how and why the body uses haptics (the sense of touch), 2) the history of haptics and the underpinnings of telepresence, 3) test cases and use cases of haptics, with a focus on key literature that exemplifies haptics in a user-led approach, and 4) the role of collisions and spatial understanding within virtual space.  

Partners and collaboration

CASE partner: the company HAPLY is fast becoming a leading haptics company, and many competitors are watching its progress as the field reaches a pivotal point for the wider use of haptics. 

UKCEH is the UK’s leading organisation for monitoring many aspects of the environment, including botanical surveys. Dr Markus Wagner, one of the leading botanists at UKCEH, will ensure that the AI training in plant morphology and the standards of botanical survey are met in this PhD project. 

Further details

For any enquiries related to this project please contact Dr Irina Tatarenko, Open University, Faculty of STEM, [Email Address Removed].  

To apply to this project:  

 Applications must be submitted by 23:59 GMT on Wednesday 8th January 2025.  


References

Basdogan, C., Giraud, F., Levesque, V. and Choi, S. (2020) ‘A review of surface haptics: enabling tactile effects on touch surfaces’, IEEE Transactions on Haptics, 13(3), pp. 450-470.
Bau, O. and Poupyrev, I. (2012) ‘REVEL: tactile feedback technology for augmented reality’, ACM Transactions on Graphics, 31(4), pp. 1-11.
Culbertson, H., Schorr, S.B. and Okamura, A.M. (2018) ‘Haptics: the present and future of artificial touch sensation’, Annual Review of Control, Robotics, and Autonomous Systems, 1(1), pp. 385-409.
Kennedy, K.A. and Addison, P.A. (1987) ‘Some considerations for the use of visual estimates of plant cover in biomonitoring’, Journal of Ecology, 75(1), pp. 151-157.
Morrison, L.W. (2016) ‘Observer error in vegetation surveys: a review’, Journal of Plant Ecology, 9(4), pp. 367-379. https://doi.org/10.1093/jpe/rtv077
Salas-Aguilar, V., Sánchez-Sánchez, C., Rojas-García, F., Paz-Pellat, F., Valdez-Lazalde, J.R. and Pinedo-Alvarez, C. (2017) ‘Estimation of vegetation cover using digital photography in a regional survey of central Mexico’, Forests, 8(10), 392. https://doi.org/10.3390/f8100392
See, A.R., Choco, J.A.G. and Chandramohan, K. (2022) ‘Touch, texture and haptic feedback: a review on how we feel the world around us’, Applied Sciences, 12(9), 4686.
Warming, E. (1909) Oecology of Plants: An Introduction to the Study of Plant-Communities. Oxford: Clarendon Press. https://doi.org/10.5962/bhl.title.23133
Wee, C., Yap, K.M. and Lim, W.N. (2021) ‘Haptic interfaces for virtual reality: challenges and research directions’, IEEE Access, 9, pp. 112145-11216.
Wilson, J.B. (2011) ‘Cover plus: ways of measuring plant canopies and the terms used for them’, Journal of Vegetation Science, 22, pp. 197-206. https://doi.org/10.1111/j.1654-1103.2010.01238.x
