Abstract
Deep learning (DL) has become a core component of modern artificial intelligence (AI), driving significant advances across diverse fields by enabling the analysis of complex systems, from protein folding in biology to molecular discovery in chemistry and particle interactions in physics. The field is evolving rapidly, however, with recent innovations in both architectures and applications. This paper therefore provides a comprehensive review of recent DL advances, covering the evolution and applications of foundational models such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs), as well as more recent architectures such as transformers, generative adversarial networks (GANs), capsule networks, and graph neural networks (GNNs). It also discusses novel training techniques, including self-supervised learning, federated learning, and deep reinforcement learning, which further enhance the capabilities of DL models. By synthesizing recent developments and identifying current challenges, the paper offers insights into the state of the art and future directions of DL research, providing valuable guidance for both researchers and industry practitioners.
| Original language | English |
|---|---|
| Article number | 755 |
| Journal | Information (Switzerland) |
| Volume | 15 |
| Issue number | 12 |
| DOIs | |
| Publication status | Published - Dec 2024 |
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs)
- SDG 9: Industry, Innovation, and Infrastructure
Keywords
- GAN
- GRU
- LLM
- LSTM
- NLP
- deep learning
- machine learning
- transformers
ASJC Scopus subject areas
- Information Systems