2021 Keynote Speakers
The Potential of Machine Learning for Hardware Design
Monday, December 6, 2021 | 8:40 AM - 9:45 AM
In this talk I'll describe the tremendous progress in machine learning over the last decade, how it has changed the hardware we want to build for performing such computations, and some areas where machine learning shows potential for helping with difficult problems in computer hardware design. I'll also briefly touch on some future directions for machine learning and how they might shape the field.
Jeff Dean (ai.google/research/people/jeff) joined Google in 1999 and is currently a Google Senior Fellow and SVP for Google Research and Google Health. His teams are working on systems for speech recognition, computer vision, language understanding, and various other machine learning tasks. He has co-designed/implemented many generations of Google's crawling, indexing, and query serving systems, and co-designed/implemented major pieces of Google's initial advertising and AdSense for Content systems. He is also a co-designer and co-implementor of Google's distributed computing infrastructure, including the MapReduce, BigTable and Spanner systems, protocol buffers, the open-source TensorFlow system for machine learning, and a variety of internal and external libraries and developer tools.
Jeff received a Ph.D. in Computer Science from the University of Washington in 1996, working with Craig Chambers on whole-program optimization techniques for object-oriented languages. He received a B.S. in computer science & economics from the University of Minnesota in 1990. He is a member of the National Academy of Engineering and of the American Academy of Arts and Sciences, a Fellow of the Association for Computing Machinery (ACM), a Fellow of the American Association for the Advancement of Science (AAAS), and a winner of the 2012 ACM Prize in Computing.
GPUs, Machine Learning, and EDA
Tuesday, December 7, 2021 | 8:40 AM - 9:45 AM
GPU-accelerated computing and machine learning (ML) have revolutionized computer graphics, computer vision, speech recognition, and natural language processing. We expect ML and GPU-accelerated computing will also transform EDA software and, as a result, chip design workflows. Recent research shows that orders-of-magnitude speedups are possible with accelerated computing platforms and that the combination of GPUs and ML can enable automation of tasks previously seen as intractable or too difficult to automate. This talk will cover near-term applications of GPUs and ML to EDA tools and chip design, as well as a long-term vision of what is possible. The talk will also cover advances in GPUs and ML hardware that are enabling this revolution.
Bill Dally joined NVIDIA in January 2009 as chief scientist, after spending 12 years at Stanford University, where he was chairman of the computer science department. Dally and his Stanford team developed the system architecture, network architecture, signaling, routing, and synchronization technology that is found in most large parallel computers today. Dally was previously at the Massachusetts Institute of Technology from 1986 to 1997, where he and his team built the J-Machine and the M-Machine, experimental parallel computer systems that pioneered the separation of mechanism from programming models and demonstrated very low overhead synchronization and communication mechanisms. From 1983 to 1986, he was at the California Institute of Technology (Caltech), where he designed the MOSSIM Simulation Engine and the Torus Routing Chip, which pioneered "wormhole" routing and virtual-channel flow control.
He is a member of the National Academy of Engineering, a Fellow of the American Academy of Arts & Sciences, a Fellow of the IEEE and the ACM, and has received the ACM Eckert-Mauchly Award, the IEEE Seymour Cray Award, and the ACM Maurice Wilkes Award. He has published over 250 papers, holds over 120 issued patents, and is an author of four textbooks. Dally received a bachelor's degree in Electrical Engineering from Virginia Tech, a master's in Electrical Engineering from Stanford University, and a Ph.D. in Computer Science from Caltech. He was a cofounder of Velio Communications and Stream Processors.
When the Winds of Change Blow, Some People Build Walls and Others Build Windmills
Wednesday, December 8, 2021 | 8:40 AM - 9:45 AM
Mr. Costello is considered to have founded the EDA industry when, in the late 1980s, he became President of Cadence Design Systems and drove annual revenues to over $1B, making Cadence the first EDA company to achieve that milestone. In 2004, he was awarded the Phil Kaufman Award by the Electronic System Design Alliance in recognition of his business contributions that helped grow the EDA industry. After leaving Cadence, Joe has led numerous startups to successful exits, such as Enlighted, Orb Networks, think3, and Altius. He received his BS in Physics from Harvey Mudd College and holds master's degrees in Physics from both Yale University and UC Berkeley.
AI, Machine Learning, Deep Learning: Where are the Real Opportunities for the EDA Industry?
Thursday, December 9, 2021 | 8:40 AM - 9:45 AM
Kurt Keutzer is a Professor of the Graduate School in EECS at the University of California, Berkeley, where he is also a member of the Berkeley AI Research (BAIR) Lab and co-director of the Berkeley Deep Drive research consortium. His research covers all aspects of making Deep Learning efficient. His "Squeeze" family of Deep Neural Nets was among the first neural nets suitable for mobile and IoT applications. His collaboration on the LARS and LAMB algorithms reduced the training time of ImageNet and BERT to minutes. Previously, Kurt was CTO at Synopsys, and his contributions to Electronic Design Automation were recognized at the 50th Design Automation Conference, where he was noted as a Top 10 most-cited author, author of a Top 10 cited paper, and one of only three people to win four Best Paper Awards in the fifty-year history of that conference. As an entrepreneur, Kurt has been an investor and advisor to over 30 startups. His most recent exits have been DeepScale (where he was co-founder), acquired by Tesla, and BabbleLabs (investor and advisor), acquired by Cisco. He was the first investor in Coverity, among the first group of investors in Tensilica, and, more recently, an investor in Covariant.