A Comprehensive Review of Deep Learning: Architectures, Recent Advances, and Applications

Ibomoiye Domor Mienye, Theo G. Swart

Research output: Contribution to journal › Article › peer-review

Abstract

Deep learning (DL) has become a core component of modern artificial intelligence (AI), driving significant advances across diverse fields by enabling the analysis of complex systems, from protein folding in biology to molecular discovery in chemistry and particle interactions in physics. However, deep learning is constantly evolving, with recent innovations in both architectures and applications. This paper therefore provides a comprehensive review of recent DL advances, covering the evolution and applications of foundational models such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs), as well as more recent architectures such as transformers, generative adversarial networks (GANs), capsule networks, and graph neural networks (GNNs). Additionally, the paper discusses novel training techniques, including self-supervised learning, federated learning, and deep reinforcement learning, which further enhance the capabilities of deep learning models. By synthesizing recent developments and identifying current challenges, this paper offers insight into the state of the art and future directions of DL research, providing valuable guidance for both researchers and industry practitioners.

Original language: English
Article number: 755
Journal: Information (Switzerland)
Volume: 15
Issue number: 12
DOIs
Publication status: Published - Dec 2024

Keywords

  • deep learning
  • GAN
  • GRU
  • LLM
  • LSTM
  • machine learning
  • NLP
  • transformers

ASJC Scopus subject areas

  • Information Systems
