Abstract

Autism spectrum disorder (ASD) is a neurodevelopmental condition characterized by difficulties in emotional expression and communication, particularly during early childhood. Understanding the affective state of children at an early age remains challenging, as conventional assessment methods are often intrusive, subjective, or difficult to apply consistently. This paper builds upon previous work on affective state recognition from children’s drawings by presenting a comparative evaluation of machine learning models for emotion classification. Three deep learning architectures—MobileNet, EfficientNet, and VGG16—are evaluated within a unified experimental framework to analyze classification performance, robustness, and computational efficiency. The models are trained using transfer learning on a dataset of children’s drawings annotated with emotional labels provided by psychological experts. The results highlight important trade-offs between lightweight and deeper architectures when applied to drawing-based affective computing tasks, particularly in mobile and real-time application contexts.
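The transfer-learning setup summarized above—a pretrained backbone kept frozen while only a small classification head is trained—can be sketched framework-free. In the sketch below the random-projection "backbone", the four emotion labels, the layer sizes, and the synthetic data are all illustrative assumptions standing in for the paper's actual ImageNet-pretrained networks and annotated drawings, not the real pipeline.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Stand-in for a frozen, pretrained backbone ---------------------------
# In the setup described in the abstract, an ImageNet-pretrained network
# (MobileNet, EfficientNet, or VGG16) is frozen and reused as a feature
# extractor. Here a fixed random projection plays that role for illustration.
BACKBONE_W = rng.standard_normal((32 * 32, 64)) * 0.01

def extract_features(images: np.ndarray) -> np.ndarray:
    """Map flattened images to 64-d embeddings; these weights never change."""
    return np.maximum(images @ BACKBONE_W, 0.0)  # ReLU features

# --- Small trainable classification head ----------------------------------
CLASSES = ["happy", "sad", "angry", "fearful"]  # illustrative label set
head_W = np.zeros((64, len(CLASSES)))

def softmax(z: np.ndarray) -> np.ndarray:
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_head(images: np.ndarray, labels: np.ndarray,
               lr: float = 0.5, epochs: int = 1000) -> None:
    """Train only the head with softmax cross-entropy; backbone stays frozen."""
    global head_W
    X = extract_features(images)        # computed once: no backbone gradients
    Y = np.eye(len(CLASSES))[labels]    # one-hot targets
    for _ in range(epochs):
        P = softmax(X @ head_W)
        head_W -= lr * X.T @ (P - Y) / len(X)

def predict(images: np.ndarray) -> np.ndarray:
    return softmax(extract_features(images) @ head_W).argmax(axis=1)

# Tiny synthetic "dataset" standing in for the annotated drawings.
imgs = rng.standard_normal((40, 32 * 32))
labels = (imgs[:, 0] > 0).astype(int)   # uses two of the four classes
train_head(imgs, labels)
train_acc = float((predict(imgs) == labels).mean())
```

Freezing the backbone is what makes training feasible on a small expert-annotated dataset: only the head's parameters (here 64 × 4) are fitted, which is the trade-off the compared architectures share regardless of their depth.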

Keywords

  • Affective computing
  • autism spectrum disorder
  • children’s drawings
  • deep learning
  • emotion recognition
  • machine learning
  • transfer learning

References

  1. American Psychiatric Association, Diagnostic and Statistical Manual of Mental Disorders, 5th ed. Washington, DC, USA: APA, 2013.
  2. H. Tager-Flusberg, R. Paul, and C. Lord, “Language and communication in autism,” in Handbook of Autism and Pervasive Developmental Disorders, 3rd ed., D. J. Cohen and F. R. Volkmar, Eds. Hoboken, NJ, USA: Wiley, 2005, pp. 335–364.
  3. G. Dawson et al., “Early behavioral intervention is associated with normalized brain activity in young children with autism,” J. Am. Acad. Child Adolesc. Psychiatry, vol. 51, no. 11, pp. 1150–1159, 2012.
  4. C. Lord et al., “Autism diagnostic observation schedule,” J. Autism Dev. Disord., vol. 30, no. 3, pp. 205–223, 2000.
  5. R. W. Picard, Affective Computing. Cambridge, MA, USA: MIT Press, 1997.
  6. C. A. Malchiodi, Understanding Children’s Drawings. New York, NY, USA: Guilford Press, 1998.
  7. E. M. Koppitz, Psychological Evaluation of Children’s Human Figure Drawings. New York, NY, USA: Grune & Stratton, 1968.
  8. A.-L. Popescu and N. Popescu, “Neural networks-based solutions for predicting the affective state of children with autism,” in Proc. 23rd Int. Conf. Control Systems and Computer Science (CSCS), Bucharest, Romania, 2021, pp. 2–5, doi: 10.1109/CSCS52396.2021.00023.
  9. A.-L. Popescu and N. Popescu, “Machine learning based solution for predicting the affective state of children with autism,” in Proc. 2020 Int. Conf. e-Health and Bioengineering (EHB), Iasi, Romania, 2020, pp. 1–4, doi: 10.1109/EHB50910.2020.9280194.
  10. A.-L. Popescu, N. Popescu, C. Dobre, E.-S. Apostol, and D. Popescu, “IoT and AI-based application for automatic interpretation of the affective state of children diagnosed with autism,” Sensors, vol. 22, no. 7, p. 2528, 2022, doi: 10.3390/s22072528.
  11. Y. LeCun, Y. Bengio, and G. Hinton, “Deep learning,” Nature, vol. 521, no. 7553, pp. 436–444, May 2015.
  12. R. A. Calvo and S. D’Mello, “Affect detection: An interdisciplinary review of models, methods, and their applications,” IEEE Trans. Affective Comput., vol. 1, no. 1, pp. 18–37, Jan.–Jun. 2010.
  13. Z. Zhang, P. Cui, and W. Zhu, “Deep learning for affective computing: A survey,” IEEE Trans. Affective Comput., vol. 9, no. 3, pp. 1–20, Jul.–Sep. 2018.
  14. A. G. Howard et al., “MobileNets: Efficient convolutional neural networks for mobile vision applications,” arXiv:1704.04861, Apr. 2017.
  15. M. Tan and Q. Le, “EfficientNet: Rethinking model scaling for convolutional neural networks,” in Proc. Int. Conf. Mach. Learn. (ICML), Long Beach, CA, USA, Jun. 2019, pp. 6105–6114.
  16. K. Simonyan and A. Zisserman, “Very deep convolutional networks for large-scale image recognition,” in Proc. Int. Conf. Learn. Representations (ICLR), San Diego, CA, USA, May 2015.
  17. Keras Applications: MobileNet, https://keras.io/api/applications/mobilenet/, accessed on 02.12.2025.
  18. Keras Applications: EfficientNet, https://keras.io/api/applications/efficientnet/, accessed on 02.12.2025.
  19. PyTorch Vision VGG16 Model, https://docs.pytorch.org/vision/main/models/generated/torchvision.models.vgg16.html, accessed on 03.12.2025.
  20. About ImageNet, https://www.image-net.org/, accessed on 20.12.2025.
  21. About PandaSays, https://www.pandasays.org/, accessed on 10.11.2025.