LLM-Twin: mini-giant model-driven beyond 5G digital twin networking framework with semantic secure communication and computation / Hong, Yang; Wu, Jun; Morello, Rosario. - In: SCIENTIFIC REPORTS. - ISSN 2045-2322. - 14:19065(2024), pp. 1-21. [10.1038/s41598-024-69474-5]
LLM-Twin: mini-giant model-driven beyond 5G digital twin networking framework with semantic secure communication and computation
Morello, Rosario
2024-01-01
Abstract
Beyond 5G networks provide solutions for next-generation communications; in particular, digital twin networks (DTNs) have gained increasing popularity for bridging the physical and digital spaces. However, current DTNs face several challenges, especially in scenarios that require efficient and multimodal data processing. First, they are limited in communication and computational efficiency, because they must transmit large amounts of raw data collected from physical sensors and maintain model synchronization through high-frequency computation. Second, current DTN models are domain-specific (e.g., e-health), making it difficult to handle DT scenarios with multimodal data processing requirements. Finally, current security schemes for DTNs introduce additional overheads that impair efficiency. To address these challenges, we propose a large language model (LLM) empowered DTN framework, LLM-Twin. First, based on LLMs, we propose digital twin semantic networks (DTSNs), which enable more efficient communication and computation. Second, we design a mini-giant model collaboration scheme, which enables efficient deployment of LLMs in DTNs and is suited to multimodal data. Finally, we design a native security policy for LLM-Twin that does not compromise efficiency. Numerical experiments and case studies demonstrate the feasibility of LLM-Twin. To our knowledge, this is the first work to propose LLM-based, semantic-level DTNs.
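As a hedged illustration of the mini-giant collaboration and semantic communication ideas summarized above (not the authors' actual implementation), the Python sketch below shows a toy loop: a lightweight "mini" encoder on the device side compresses a raw sensor frame into a short semantic vector, only that vector crosses the network, and a "giant" model on the digital-twin side reconstructs a state estimate for synchronization. All names and dimensions (MiniEncoder, GiantDecoder, RAW_DIM, SEM_DIM) are hypothetical.

```python
# Hypothetical sketch of a mini-giant semantic communication loop for DTNs.
# A real system would use LLM-based semantic encoders/decoders; fixed random
# linear maps stand in here, only to show what is transmitted over the link.
import numpy as np

RAW_DIM = 256   # size of a raw multimodal sensor frame (assumption)
SEM_DIM = 8     # size of the compact semantic vector sent over the link (assumption)

rng = np.random.default_rng(0)


class MiniEncoder:
    """Device-side 'mini' model: raw frame -> short semantic vector."""

    def __init__(self) -> None:
        self.proj = rng.standard_normal((SEM_DIM, RAW_DIM)) / np.sqrt(RAW_DIM)

    def encode(self, frame: np.ndarray) -> np.ndarray:
        return self.proj @ frame


class GiantDecoder:
    """Twin-side 'giant' model: semantic vector -> reconstructed state."""

    def __init__(self, encoder: MiniEncoder) -> None:
        # The pseudo-inverse stands in for the giant model's richer knowledge.
        self.recon = np.linalg.pinv(encoder.proj)

    def decode(self, semantic: np.ndarray) -> np.ndarray:
        return self.recon @ semantic


if __name__ == "__main__":
    mini = MiniEncoder()
    giant = GiantDecoder(mini)
    frame = rng.standard_normal(RAW_DIM)   # raw sensor reading on the device
    semantic = mini.encode(frame)          # 8 floats transmitted instead of 256
    estimate = giant.decode(semantic)      # twin-side state estimate
    saving = 1 - SEM_DIM / RAW_DIM
    print(f"transmitted {SEM_DIM}/{RAW_DIM} values ({saving:.0%} less traffic)")
    print(f"reconstruction error: {np.linalg.norm(frame - estimate):.2f}")
```

The point of the sketch is only the traffic pattern: the twin stays synchronized from a compact semantic summary rather than from the raw sensor stream, which is the efficiency argument the abstract makes for DTSNs.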
File | Description | Type | Access | License | Size | Format
---|---|---|---|---|---|---
Hong_2024_SciReports_LLM_Editor.pdf | Editorial version | Editorial version (PDF) | Open access | Creative Commons | 5.39 MB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.