Knowledge Archive
As the emphasis on ever larger and more powerful foundation models grows, some worry about models going rogue. Yet the everyday problems on which we must focus our AI efforts today are concrete: optimizing energy production, securing the next generation of telecommunications infrastructure, and proving the safety of an aeroplane. Moreover, only a few firms have internet-scale data along the lines of Meta or Amazon, which means a concentration of wealth and power will inevitably occur. Even though many technologists, executives, and sometimes politicians want bigger and more powerful models, many disagree. Deep learning foundation models are not software-composable, and deploying this technology alone in life-critical environments is not a problem that bigger models can solve today. #ai #ChatGPT #data
☆ WKF | http://www.wkforum.org/WKF/2021/kr/
☆ Instagram | https://www.instagram.com/worldknowledgeforum/
☆ Youtube | https://www.youtube.com/wkforum
✻ World Knowledge Forum's lecture contents are copyrighted by Maekyung Media Group. Acts such as illegal downloading, re-uploading, and re-processing are prohibited.