The AI Era, Part 3: LLMs, SLMs, and Foundation Models

SMT Perspectives and Prospects, by Dr. Jennie S. Hwang, CEO, H-Technologies Group

SMT007 Magazine, October 2024

Since the introduction of ChatGPT on Nov. 30, 2022, and ChatGPT-4 on March 14, 2023, large language models (LLMs) have been in everyday news and conversations. LLMs represent a significant advancement in AI, with the potential to revolutionize multiple fields. This column offers a snapshot of LLMs from the user's perspective.

As a subset of AI models, LLMs are designed to understand, process, and manipulate human language and to generate human-like text by learning patterns and relationships. A model is trained on vast datasets, which allow it to recognize, translate, predict, and generate text or other content and to perform a wide range of tasks related to natural language processing (NLP). The recent success of LLMs stems from the following:

• The introduction of transformer architectures
• The capability of increased computational power
• The availability and use of vast training data

The underlying technology of LLMs is deep learning, particularly neural networks. Deep learning algorithms are capable of a wide range of natural language tasks. The most common architecture for LLMs is the transformer model, introduced in the groundbreaking 2017 paper "Attention Is All You Need" by Vaswani et al.1
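For readers who want to see the transformer's central idea in concrete form, the short Python sketch below illustrates scaled dot-product attention, the core operation described in the Vaswani et al. paper cited above. It is not taken from this column; the function name, the use of NumPy, and the toy matrix sizes are illustrative assumptions, and production LLMs add multi-head attention, learned projection weights, masking, and highly optimized GPU kernels.

    # Minimal sketch of scaled dot-product attention, the core operation of the
    # transformer architecture ("Attention Is All You Need," Vaswani et al., 2017).
    # Illustrative only; real LLMs use many attention heads and learned projections.
    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        """Compute attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
        d_k = Q.shape[-1]                        # dimensionality of the key vectors
        scores = Q @ K.T / np.sqrt(d_k)          # similarity of each query to each key
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
        return weights @ V                       # weighted sum of the value vectors

    # Toy usage: 4 token positions, 8-dimensional embeddings (arbitrary sizes).
    rng = np.random.default_rng(0)
    Q = rng.normal(size=(4, 8))
    K = rng.normal(size=(4, 8))
    V = rng.normal(size=(4, 8))
    print(scaled_dot_product_attention(Q, K, V).shape)   # -> (4, 8)

In words: each token's query vector is compared against every key vector, the resulting scores are normalized into attention weights, and the output for each token is a weighted blend of the value vectors, which is how the model learns relationships across an entire sequence at once.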
