DCW Conference Programme 2024
Deploying Sustainable and Large-Scale Computational Infrastructure Tuned for HPC and AI
Investment in AI and ML, and in Large Language Model (LLM) use cases in particular, is driving a major build-out and an inflection point in the design of hyperscale computational infrastructure, which will include a very large number of specialised AI and ML nodes. Current projections of the resource consumption of this AI-driven data center build-out indicate that the power, water, and carbon footprint of today's computational infrastructure must change. This talk will discuss the latest open innovations being developed by the OCP Community to deliver the next generation of data center facilities and computational infrastructure needed to fulfil the promise of AI.