OpenAI's chatbot ChatGPT is powered by more than 10,000 Nvidia 'A100' graphics processing units (GPUs).
Handling massive volumes of computation quickly requires many GPUs, which inevitably entails high power consumption and cost.
Reducing such semiconductor inefficiency and power consumption is therefore crucial to deploying artificial intelligence (AI) rapidly as a driving force across all industries.
Against this backdrop, industry attention has turned to researchers at KAIST (Korea Advanced Institute of Science and Technology), who have developed what they describe as the world's first ultra-low-power, ultra-high-speed AI semiconductor, the 'Transformative Transformer'.
On the 6th, the Ministry of Science and ICT announced that Professor Yu Hoi-Jun's research team at the KAIST AI Semiconductor Graduate School and the KAIST PIM Semiconductor Research Center had developed the 'Transformative Transformer', billed as the world's first AI semiconductor of its kind.
It consumes just 400 milliwatts (mW) of power and can process a large language model (LLM) in only 0.4 seconds.
The Transformative Transformer implements Transformer functionality by selectively combining two kinds of networks: 'Spiking Neural Networks' (SNNs), a neuromorphic computing approach designed to mimic the structure and function of the human brain, and 'Deep Neural Networks' (DNNs), the deep learning models widely used for processing visual data.
An SNN processes information using time-dependent signals called spikes, much as neurons in the brain do.
In essence, it brings AI processing capability onto the chip by implementing brain-like neurons in silicon.
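To make the spike-based idea concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, a common textbook building block for SNNs. This is purely illustrative; it is not the circuit design the KAIST team implemented, and the threshold and leak values are arbitrary assumptions.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: a textbook SNN building block.
# Illustrative sketch only; parameters (threshold, leak) are arbitrary.

def lif_neuron(input_current, threshold=1.0, leak=0.9):
    """Integrate input over time steps; emit a spike (1) when the
    membrane potential crosses the threshold, then reset."""
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # spike event: information lives in its timing
            potential = 0.0    # reset after firing
        else:
            spikes.append(0)
    return spikes

# A weak constant input fires only sparsely (steps 4 and 8 here);
# sparse firing is what makes spike-based processing energy-efficient.
print(lif_neuron([0.3] * 10))
```

Because the neuron is silent most of the time, an SNN chip can, in principle, spend energy only when spikes occur, which is one motivation for mixing SNNs with conventional DNNs.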
The Transformer is a neural network that tracks relationships within data, such as words in a sentence, to learn context and meaning.
It is the foundational technology behind ChatGPT.
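The relationship-tracking at the heart of the Transformer is scaled dot-product attention. The following NumPy sketch shows the standard textbook formulation, softmax(QKᵀ/√d)·V, not the chip's actual implementation; the toy embeddings are made up for illustration.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) @ V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)          # pairwise token similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                     # mix values by relevance

# Three toy 4-dimensional token embeddings attending to one another.
x = np.random.default_rng(0).normal(size=(3, 4))
out = attention(x, x, x)
print(out.shape)  # (3, 4): one context-mixed vector per token
```

Each output row is a weighted blend of all token values, which is how the model learns context such as which words in a sentence relate to which.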
Until now, running the LLMs behind ChatGPT has required a large number of GPUs, with each GPU consuming some 250 watts (W) of power.
Operating AI based on ultra-large models, which demand countless computations in a short time, has necessitated massive quantities of high-performance semiconductor chipsets.
GPUs, which were not originally designed for AI computation, come at a high cost in power efficiency, chip area, and execution time.
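Taking the article's own figures at face value, a quick back-of-envelope check shows what 400 mW versus a 250 W GPU implies per chip:

```python
# Headline figures from the article (per single chip/GPU).
gpu_power_w = 250.0    # cited power draw of one GPU (W)
chip_power_w = 0.4     # KAIST chip: 400 mW = 0.4 W

ratio = gpu_power_w / chip_power_w
print(f"{ratio:.0f}x lower power per chip")  # 625x lower power per chip
```

This is a chip-to-chip comparison of the quoted numbers only; real system-level savings depend on how many chips are needed and on throughput per chip.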
Even Sam Altman, the CEO of OpenAI, recently noted that a single ChatGPT use "costs a few cents."
These high-power, high-cost issues are also why South Korean companies and the government have embarked on developing AI semiconductors.
As AI services expand, demand for low-power, high-efficiency semiconductors is likely to skyrocket. The industry anticipates that the 'Transformative Transformer', fabricated on Samsung Electronics' 28-nanometer process, could open up a new AI semiconductor market once it is refined and commercialized.
So, we've been checking out the new AI chip at KAIST, and it's looking like good news for anyone interested in Korean tech stocks.
With KAIST leading the way on AI chip technology, there's potential for some serious growth.
Investing in Korean tech stocks could pay off big time, especially with the AI boom happening worldwide.
Keep an eye on what's happening and think about adding these stocks to your portfolio!
#KAIST #AIsemiconductors #Koreantech #Investing #ScienceStocks #TechInnovation #AIboom #FutureTech #KAISTresearch #LongTermGains #Semiconductors