Issue link: https://iconnect007.uberflip.com/i/1518339
12 SMT007 MAGAZINE I APRIL 2024

Operating AI demands heavy-load hardware that processes algorithms, runs the models, and keeps data flowing. These bandwidth-hungry applications necessitate higher-speed data transfer, which opens a crucial role for photons: taking advantage of the speed of light to deliver greater bandwidth with lower latency and power. Hardware components typically connect via copper interconnects, while the connections between racks in data centers often use optical fiber. CPUs and GPUs also use optical interconnects for optical signals. Both electrons and photons will play an increased role. AI will drive the need for near-packaged optics with high-performance PCB substrates (or an interposer) on the host board. Co-packaged optics, a single-package integration of electronic and photonic dies or photonic integrated circuits (PICs), are expected to play a pivotal role.

AI Market and Hardware

High-performance hardware is indispensable to AI, particularly computing chips. As AI becomes embedded in every sector of industry and every aspect of daily life and business, the biggest winners so far are hardware manufacturers: 80% of AI servers use GPUs, a share expected to grow to 90%. Beyond the GPU itself, the required pairing memory places high demand on high bandwidth memory (HBM). The advent of generative AI further propels accelerated computing, which uses GPUs alongside CPUs to meet augmented performance requirements.

Although forecasts of the future AI market vary, according to PwC1, AI could contribute more than $15 trillion to the global economy by 2030. Most agree that the impact of AI adoption could be greater than the inventions of the internet, mobile broadband, and the smartphone combined.

AI Historical Milestones

AI is not a new term. John McCarthy coined "artificial intelligence" and held the first AI conference in 1956. "Shakey the Robot," the first general-purpose mobile robot, was built in 1969.
In the succeeding decades, AI went through a roller coaster of successes and setbacks until the 2010s, when key events, including the introduction of big data and machine learning (ML), created an age in which machines have the capacity to collect and process volumes of information too cumbersome for a person to handle. Other pace-setting technologies followed: deep learning and neural networks were introduced in 2010, GANs in 2014, and transformers in 2017.

The 2020s are when AI "finally" gained traction, especially with the introduction of generative AI, the release of ChatGPT on Nov. 30, 2022, and the phenomenal ChatGPT-4 on March 14, 2023. It feels like AI has suddenly become a global phenomenon. The rest is history.

AI Bedrock Technologies

Generally speaking, AI is a digital technology that mimics the intellectual, analytical, and creative abilities of humans, largely by absorbing and finding patterns in enormous amounts of information and data. AI covers a multitude of technologies, including machine learning (ML), deep learning (DL), neural networks (NN), natural language processing (NLP), and their closely aligned technologies. One view of the AI hierarchy is shown in Figure 1, which exhibits the interrelations and evolution of these underpinning technologies.

Now I'd like to briefly highlight each technology:

Machine Learning

Machine learning is a technique that collects and analyzes data, looks for patterns, and adjusts its actions accordingly to develop statistical mathematical models. The resulting algorithms allow software applications to pre-