WebMorph is a web-based version of Psychomorph with several additional functions. While WebMorph is optimized for averaging and transforming faces, you can delineate and average any type of image. WebMorph also has several batch functions for processing large numbers of images automatically and can even create animated GIFs of your transforms.

WebMorph is made possible by the kind help of Bernie Tiddeman, who developed and maintains the desktop version of Psychomorph. WebMorph uses the open-source Java library FaceMorphLib and is developed and maintained by Lisa DeBruine. Much of the development of WebMorph, especially the 3D functions, was funded by ERC grant #647910 KINSHIP.

WebMorph is currently in beta testing and is likely to remain so for some time. This means that there will be bugs and you cannot rely on the website being functional 100% of the time. Lisa will try to fix any problems as quickly as possible, but she is the only person working on this project, so please be patient. If you’re curious about the code or want to help with development, the project is open source on GitHub.

0.1 Bugs and Suggestions

If you spot any errors in WebMorph or this manual, or have a suggestion, please add it to the list of issues.

0.2 Citations

DeBruine, Lisa. (2018). WebMorph (Beta Release 2). Zenodo. doi: 10.5281/zenodo.1073696

To cite the morphing and transforming methods, see Bernie Tiddeman’s webpage.

The symmetric image scrambling methods were first published in Conway et al. (2008).

The built-in image sets (DeBruine and Jones 2017) are available with a CC-BY 4.0 license.

0.3 Ethical Issues

We are committed to ethical face research. This means:

  1. Make sure that the use of face photographs respects participant consent and personal data privacy. Images that are “freely” available on the internet are a grey area, and any such use should be carefully considered and approved by the relevant ethics board.

  2. Do not use face images in research where there is a possibility of real-world consequences for the pictured individuals. For example, do not post identifiable images of real people on real dating sites without the explicit consent of the pictured individuals for that specific research.

  3. Never support the use of face image analysis to predict behaviour or as automatic screening. For example, face images must not be used to predict criminality or to decide who should proceed to the interview stage of a job application. This type of application is unethical because the predictive data are always biased. Face image analysis is useful for researching which aspects of face images give rise to the perception of traits like trustworthiness, but this should not be confused with the ability to detect actual behaviour.