Today’s data centers are bursting at the seams. As a result, data center managers are looking for new ways to accelerate an array of data-driven workloads arising from IoT and AI-based applications, ranging from deep learning to natural language processing (NLP). The following article takes an in-depth look at the agile, hyper-converged data center architectures that are accelerating model training and inference, data analytics, and other distributed applications.

Data centers are expanding to the network edge to meet demand from artificial intelligence and other applications that require faster response times than traditional data center architectures can deliver.

The problem with traditional architectures is their centralized framework: data often travels hundreds of miles from the edge to a central data center and back again. That’s fine when you’re dealing with email, Google, Facebook, and other applications delivered via the cloud; the human brain is a slow computer, unable to register the lag between, say, clicking on an email message in a browser and the message opening.

But AI and other emerging applications, including the Internet of Things (IoT), cloud-based gaming, and virtual reality, require much lower network latency, the delay between sending a request and receiving a response. That means data center processing must move to the network edge. Edge computing can take place in small data centers, roughly the size of shipping containers, rather than the warehouse-sized edifices that currently power the cloud.
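To put the distance argument in perspective, here is a rough back-of-the-envelope sketch (illustrative only, not from the article): it assumes signals in optical fiber travel at roughly two-thirds the speed of light and counts propagation delay alone, ignoring routing, queuing, and server processing; the distances are hypothetical.

```python
# Back-of-the-envelope propagation latency from distance alone.
# Assumes ~2/3 the speed of light in optical fiber (~200 km per ms);
# real round trips add routing, queuing, and processing time on top.

SIGNAL_SPEED_KM_PER_MS = 200.0  # approximate signal speed in fiber

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time for a signal traveling distance_km each way."""
    return 2.0 * distance_km / SIGNAL_SPEED_KM_PER_MS

# Hypothetical distances: a regional cloud data center vs. a nearby edge site.
for label, km in [("Cloud data center, 500 km away", 500.0),
                  ("Edge data center, 20 km away", 20.0)]:
    print(f"{label}: {round_trip_ms(km):.2f} ms propagation delay")
```

Even this best case shows why the applications above push processing toward the edge: the long round trip alone costs several milliseconds, while the nearby site keeps propagation delay in the sub-millisecond range.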

Read the full story on EETimes


