Distributed asynchronous and selective optimization
Hyperparameter optimization
Environmental footprint
Efficiency
Community outreach
Energy consumption
AI workload
Perun
Model energy consumption
Uncertainty quantification
Questions
Power generation forecasting
Attention matrix sparsity
Energy efficiency
Description:
Explore research on robust and efficient AI at scale in this 51-minute NHR PerfLab seminar talk by Dr. Charlotte Debus. Delve into the challenges and opportunities of large-scale AI models, with a focus on time series forecasting in scientific applications. Learn about transformer architectures, scalability issues, and the balance between computational power and energy efficiency. Discover approaches to addressing the growing energy consumption of AI, including distributed optimization techniques and hyperparameter tuning. Gain insights into real-world applications such as electric load forecasting and power generation prediction. Examine the environmental impact of AI workloads and strategies for developing more sustainable AI solutions. Engage with topics such as uncertainty quantification, attention matrix sparsity, and community outreach in the context of advancing AI technologies for scientific research and engineering.