ML engineer Shashank Kapadia of Walmart discusses the 3x growth in AI model parameters over the past 10 years and its implications. Key topics include why GPUs, originally designed for gaming, excel at the parallel processing deep learning requires; why scaling AI smartly beats simply scaling bigger; and the ethical concerns around bias and transparency as models grow more complex. He also addresses the carbon footprint of designing efficient architectures, and frames AI as a productivity assistant rather than a replacement for developers.
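The GPU point can be illustrated in miniature: deep learning workloads are dominated by large matrix multiplications, and every element of a matrix product is an independent dot product, so thousands of them can be computed simultaneously on parallel hardware. A minimal sketch (NumPy and the loop-vs-vectorized comparison are illustrative choices, not code from the article):

```python
import numpy as np

# Deep learning layers reduce to matrix multiplication. Each output
# element is an independent dot product, so all of them can run in
# parallel -- the workload shape GPUs were built for.

rng = np.random.default_rng(0)
x = rng.standard_normal((64, 128))   # a batch of 64 input vectors
w = rng.standard_normal((128, 32))   # one layer's weight matrix

# Sequential view: compute one dot product at a time, as a naive CPU loop would.
slow = np.empty((64, 32))
for i in range(64):
    for j in range(32):
        slow[i, j] = x[i] @ w[:, j]

# Parallel view: one matmul call that parallel hardware fans out
# across many cores at once.
fast = x @ w

assert np.allclose(slow, fast)  # same result, very different hardware utilization
```

The numbers agree either way; the difference is that the single `x @ w` call exposes all 64 x 32 independent dot products to the hardware at once, which is why the same model runs orders of magnitude faster on a GPU.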
Table of contents
- How GPUs excel at parallel processing for deep learning, and why scaling smartly beats scaling bigger
- What is a GPU?
- Key takeaways