TY - GEN
T1 - Network Architecture for Value Enhanced LLM Use Considering Volunteer Computing Paradigm
AU - Periola, Ayodele A.
AU - Alonge, Akinunde A.
AU - Ogudo, Kingsley A.
N1 - Publisher Copyright:
© 2025 Copyright held by the owner/author(s).
PY - 2025/11/25
Y1 - 2025/11/25
N2 - Large language model (LLM)-based applications are widely recognized as the future of artificial intelligence adoption. However, these applications benefit from uncompensated volunteer input, which arises from core service tasks, such as output preference indication, performed by a large number of subscribers. Given that large-scale LLM adoption is poised to trigger job losses, the challenge of uncompensated volunteer input should be addressed. This is especially important for developing countries. The presented research proposes a network architecture with cost consensus capacity in this regard. Here, cost consensus capacity refers to considering the heterogeneity of LLM queries, and their varying future usefulness to other LLM system subscribers, in determining the cost associated with a volunteer’s work. The proposed network architecture recognizes and addresses the risk of an artificial intelligence winter: a cost reduction factor is applied to the compensation of volunteer work input. In the proposed approach, compensation is given not to the volunteer but to the sovereign authority in the volunteer’s location. Applying the cost reduction factor to prevent an artificial intelligence winter reduces costs by an average of 45.5–78.2%.
AB - Large language model (LLM)-based applications are widely recognized as the future of artificial intelligence adoption. However, these applications benefit from uncompensated volunteer input, which arises from core service tasks, such as output preference indication, performed by a large number of subscribers. Given that large-scale LLM adoption is poised to trigger job losses, the challenge of uncompensated volunteer input should be addressed. This is especially important for developing countries. The presented research proposes a network architecture with cost consensus capacity in this regard. Here, cost consensus capacity refers to considering the heterogeneity of LLM queries, and their varying future usefulness to other LLM system subscribers, in determining the cost associated with a volunteer’s work. The proposed network architecture recognizes and addresses the risk of an artificial intelligence winter: a cost reduction factor is applied to the compensation of volunteer work input. In the proposed approach, compensation is given not to the volunteer but to the sovereign authority in the volunteer’s location. Applying the cost reduction factor to prevent an artificial intelligence winter reduces costs by an average of 45.5–78.2%.
KW - Artificial Intelligence
KW - Computing Task Volunteers
KW - Intelligent Networks
KW - LLMs
UR - https://www.scopus.com/pages/publications/105022512167
U2 - 10.1145/3759023.3759092
DO - 10.1145/3759023.3759092
M3 - Conference contribution
AN - SCOPUS:105022512167
T3 - icABCD 2025 - Proceedings of the 2025 International Conference on AI, Big Data, Computing and Data Communication Systems
BT - icABCD 2025 - Proceedings of the 2025 International Conference on AI, Big Data, Computing and Data Communication Systems
A2 - Watson, Bruce
A2 - Singh, Upasana
A2 - Pudaruth, Sameerchand
PB - Association for Computing Machinery, Inc
T2 - 2025 International Conference on Artificial Intelligence, Big Data, Computing and Data Communication Systems, icABCD
Y2 - 26 November 2025 through 27 November 2025
ER -