
Promoting Interpretability and Transparency in AI-assisted Graphic Design

   Centre for Accountable, Responsible and Transparent AI

This project is no longer listed and may not be available.

Supervisor: Dr Nanxuan (Cherry) Zhao
Applications accepted all year round
Competition Funded PhD Project (UK Students Only)

About the Project

Graphic designs are everywhere in our daily lives: presentation slides, posters, advertisements, websites, and more. They are essential tools for communicating our ideas effectively. Because so many design factors must be weighed at once, such as color, imagery, icons, and fonts, the exploration space is enormous, with thousands upon thousands of possible choices. Creating a graphic design is therefore not easy, especially for novices without any design experience.

Recently, powered by deep learning and rich computational resources, AI has increasingly been incorporated into graphic design assistant tools. However, existing tools only present final decisions as a list of suggestions, without any further explanation or visualization. Users, especially novices, often fully trust the suggested results, placing enormous responsibility on the AI methods. Since results generated by AI may fall below expectations and fail to meet design standards, it is necessary to make the underlying algorithms interpretable and transparent, with self-explanatory outputs. A further ethical issue in AI-assisted graphic design is copyright: avoiding infringement so as to protect the basic rights of creators and designers is equally important.

In this project, we will investigate interpretability methods for AI-assisted graphic design tools using advanced visualization and machine learning techniques. The project aims to create more trustworthy and ethical AI for graphic design and, more broadly, for the creative arts.

This project is associated with the UKRI Centre for Doctoral Training (CDT) in Accountable, Responsible and Transparent AI (ART-AI). We value people from different life experiences with a passion for research. The CDT's mission is to graduate specialists with diverse perspectives who can go out into the world and make a difference.

Applicants should hold, or expect to receive, a First or Upper Second Class Honours degree in a relevant subject. A master’s level qualification would also be advantageous. Applicants should have taken a mathematics unit or a quantitative methods course at university, or hold at least a grade B in A level maths or an international equivalent. Desirable qualities in candidates include intellectual curiosity and experience with machine learning and deep learning.

Informal enquiries about the research or project should be directed to Dr Zhao.

Formal applications should be accompanied by a research proposal and made via the University of Bath’s online application form. Enquiries about the application process should be sent to [Email Address Removed].

Start date: 3 October 2022.

Funding Notes

ART-AI CDT studentships are available on a competition basis and applicants are advised to apply early as offers are made from January onwards. Funding will cover tuition fees and maintenance at the UKRI doctoral stipend rate (£16,062 per annum in 2022/23, increased annually in line with the GDP deflator) for up to 4 years.
We also welcome applications from candidates who can source their own funding.


References

Zhao, Nanxuan, et al. "Selective Region-based Photo Color Adjustment for Graphic Designs." ACM Transactions on Graphics (TOG) 40.2 (2021): 1-16.