
Exploring Decentralized Artificial Intelligence: Advancing the Democratization of GenAI

30 Nov 2024


With the rapid development of Generative AI (GenAI) technologies—such as Large Language Models (LLMs), Multimodal Large Language Models (MLLMs), and Stable Diffusion—AI is increasingly permeating and transforming various industries, including life sciences, energy, finance, and entertainment. These technological breakthroughs not only accelerate innovation and enable personalized services but also significantly improve the efficiency of workflows. According to market forecasts, the global GenAI market is expected to grow from USD 40 billion in 2022 to USD 1.3 trillion over the next decade.


Challenges to Widespread Adoption and Strategic Countermeasures

Despite its promise, the widespread adoption of GenAI faces substantial challenges. One of the most pressing issues is the concentration of GPU resources among major technology firms, which restricts the capabilities of research institutions and enterprises in developing their own models. Many organizations are forced to rely on API-based solutions, which not only introduce latency and security risks but also limit the customizability of models. Although open-source models offer some flexibility, they are often not sufficiently adaptable to domain-specific knowledge, hindering deep engagement by researchers in the pretraining phase, a critical stage for creating powerful and domain-aligned models.


In response, The Hong Kong Polytechnic University is pioneering an innovative GenAI infrastructure that enables enterprises and applications to independently pretrain their own GenAI models. This is achieved through a novel "Model over Models" (MoM) methodology to build foundation models. Specifically, global knowledge is divided into thousands of domains, with relatively lightweight Small Language Models (SLMs) trained for each. These smaller models demand far fewer resources—e.g., a 7-billion-parameter model can be continually pretrained using just 64 to 128 GPUs. Eventually, these SLMs can be integrated via the MoM framework to construct affordable and scalable Artificial General Intelligence (AGI) models, significantly lowering barriers to entry and enabling global participation in foundation model development.
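The MoM idea of fronting many domain-specific SLMs with a single integration layer can be sketched in a few lines. The following is a minimal illustration only; the class and function names (`DomainSLM`, `MoMRouter`, `keyword_score`) are hypothetical and do not reflect PolyU's actual framework or API, and the keyword-overlap routing stands in for whatever learned routing a real MoM system would use.

```python
from typing import Callable, Dict


class DomainSLM:
    """Stands in for a lightweight Small Language Model pretrained on one domain."""

    def __init__(self, domain: str):
        self.domain = domain

    def generate(self, prompt: str) -> str:
        # A real SLM would run inference here; we return a tagged placeholder.
        return f"[{self.domain}] answer to: {prompt}"


class MoMRouter:
    """Integrates many domain SLMs behind one interface, in the spirit of MoM."""

    def __init__(self, slms: Dict[str, DomainSLM],
                 score: Callable[[str, str], float]):
        self.slms = slms
        self.score = score  # relevance of a prompt to a domain

    def generate(self, prompt: str) -> str:
        # Route the prompt to the most relevant domain model.
        best = max(self.slms, key=lambda d: self.score(prompt, d))
        return self.slms[best].generate(prompt)


def keyword_score(prompt: str, domain: str) -> float:
    # Naive relevance score: word overlap between the prompt and the domain name.
    return sum(word in prompt.lower() for word in domain.lower().split())


router = MoMRouter(
    {d: DomainSLM(d) for d in ["life sciences", "energy", "finance"]},
    keyword_score,
)
print(router.generate("How do energy markets price solar power?"))
```

In this sketch each SLM covers one knowledge domain and stays cheap to train, while the router provides the single foundation-model-like interface; the article's point is that thousands of such domain models, each trainable on tens of GPUs, can be composed rather than training one monolithic model on a massive GPU cluster.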


For more details, please visit: https://www.stheadline.com/knowledge/3406043/


