
Tuesday, October 08, 2019

Learning More Efficient Allocation

Consider other possibilities for real-time efficiency.

Artificial intelligence could help data centers run far more efficiently
MIT system “learns” how to optimally allocate workloads across thousands of servers to cut costs, save energy.

Rob Matheson | MIT News Office

A novel system developed by MIT researchers automatically “learns” how to schedule data-processing operations across thousands of servers — a task traditionally reserved for imprecise, human-designed algorithms. Doing so could help today’s power-hungry data centers run far more efficiently.

Data centers can contain tens of thousands of servers, which constantly run data-processing tasks from developers and users. Cluster scheduling algorithms allocate the incoming tasks across the servers in real time to utilize all available computing resources efficiently and get jobs done fast.
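The cluster scheduling problem described above can be illustrated with a minimal greedy baseline: assign each incoming task to the currently least-loaded server. This is a sketch of the kind of human-designed heuristic the article contrasts with the learned approach, not the MIT system itself; the function and variable names are illustrative.

```python
import heapq

def schedule_least_loaded(tasks, num_servers):
    """Greedy baseline scheduler (illustrative, not the MIT system).

    tasks: list of task durations (e.g., estimated processing times).
    Returns (assignment, loads) where assignment maps task index ->
    server id and loads maps server id -> total assigned work.
    """
    # Min-heap of (current_load, server_id) so the least-loaded
    # server is always at the top.
    heap = [(0.0, s) for s in range(num_servers)]
    heapq.heapify(heap)
    assignment = {}
    for i, duration in enumerate(tasks):
        load, server = heapq.heappop(heap)
        assignment[i] = server
        heapq.heappush(heap, (load + duration, server))
    loads = {server: load for load, server in heap}
    return assignment, loads
```

A learned scheduler would replace the fixed "least loaded" rule with a policy trained (for example, by reinforcement learning) to minimize job completion time under real workload patterns.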
                                                      
