
Top 10: Uses Of AI In Data Centres

We explore how AI is transforming data centre facilities, from revolutionising operations to enhancing security and more

The integration of AI in data centres is revolutionising how facilities operate, from providing unprecedented efficiency to maximising security and transforming scalability. AI technologies have become a vital tool for data centre operators, streamlining operations and transforming many aspects of data centre management. AI's role in data centres is already having an enormous impact, and its potential is still growing.

10. Training & development

According to research by digital infrastructure company Equinix, 62% of global IT decision-makers view a shortage of personnel with IT skills as one of the main threats to their business.

New AI technologies are among the most widely cited solutions to the current talent crisis. Pioneering talent specialists across the world are utilising AI to help clients achieve a more efficient, supportive and intuitive approach to both onboarding and talent retention.

“In terms of supporting talent, AI can be very effective as a tool supporting augmented reality training scenarios, providing efficient real-time operational analytics and for attracting talent by demonstrating that a business is leveraging new and emerging technologies to provide employees with more interesting and future-proof roles,” said Mick Lane, Global Technology Solutions Manager at CBRE.

9. Infrastructure management

In our Top 10 5G Infrastructure Companies we reviewed the leading companies in 5G infrastructure, with a commitment to providing better connectivity for all. Qualcomm has been developing 5G consistently for years. The company’s 5G Advanced services will support new devices, services, spectrum and deployments. It also views 5G and AI as complementary, advancing together and mutually benefiting each other in terms of performance and efficiency.

8. Data processing & storage

AI supports data processing and storage for data centres in many ways. AI can continuously monitor system parameters and adjust them to optimise processing performance, which ensures that processing tasks are executed efficiently.

Data can be automatically moved between different storage tiers (such as SSDs, HDDs and cloud storage) based on usage patterns. This ensures that data accessed most frequently is stored on faster, more expensive media, while infrequently accessed data is moved to cheaper, slower media. AI can also analyse and identify redundant data and suggest compression or deduplication techniques to save storage space.
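
To make the tiering idea concrete, below is a minimal sketch of usage-based tier selection; the thresholds, tier names and sample objects are illustrative assumptions rather than any specific platform's policy.

```python
# A minimal sketch of usage-based storage tiering: hot objects stay on fast media,
# cold objects move to cheaper tiers. Thresholds, tier names and sample objects
# are illustrative assumptions, not any specific platform's policy.
from dataclasses import dataclass

@dataclass
class DataObject:
    name: str
    accesses_last_30d: int
    current_tier: str

def recommend_tier(obj: DataObject) -> str:
    """Pick a storage tier from recent access frequency."""
    if obj.accesses_last_30d >= 100:
        return "ssd"            # hot data stays on fast, expensive media
    if obj.accesses_last_30d >= 10:
        return "hdd"            # warm data moves to cheaper disk
    return "cloud_archive"      # cold data goes to the slowest, cheapest tier

def plan_migrations(objects: list[DataObject]) -> list[tuple[str, str, str]]:
    """Return (object, from_tier, to_tier) for every object whose tier should change."""
    return [(o.name, o.current_tier, recommend_tier(o))
            for o in objects if recommend_tier(o) != o.current_tier]

if __name__ == "__main__":
    sample = [
        DataObject("billing_db_snapshot", 250, "hdd"),
        DataObject("2019_audit_logs", 1, "ssd"),
    ]
    print(plan_migrations(sample))
```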

7. Smart cooling

AI and high-performance computing have put increased demand on data centre power. As a result, data centres have had to find ways to keep themselves cool, which has led to the development of smart cooling solutions. To paraphrase Plato, necessity is the mother of invention.

Smart cooling uses AI and machine learning to adjust cooling parameters, unlike traditional air and liquid cooling which rely on predefined settings and manual adjustments.

Smart cooling optimises energy use by predicting conditions and responding to them in real time. It can also adapt automatically to changes in server workload and environmental conditions, whereas traditional systems are less adaptable.
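
As a rough illustration of the difference, here is a minimal predictive-control sketch: fit a simple model mapping workload to expected inlet temperature, then adjust the cooling setpoint before the heat arrives. The readings, baseline setpoint and linear model are illustrative assumptions, not a production control loop.

```python
# A minimal predictive-cooling sketch: fit a simple workload -> inlet-temperature
# model, then lower the cooling setpoint before a forecast hot spell arrives.
# The readings, baseline setpoint and linear model are illustrative assumptions.
import numpy as np

history_load = np.array([0.2, 0.4, 0.6, 0.8, 1.0])        # fraction of rack capacity
history_temp = np.array([21.0, 22.5, 24.0, 26.0, 28.5])   # observed inlet temps, deg C

coeffs = np.polyfit(history_load, history_temp, deg=1)    # a real system might use ML here

def predicted_temp(load: float) -> float:
    """Expected inlet temperature for a given workload level."""
    return float(np.polyval(coeffs, load))

def cooling_setpoint(load_forecast: float, target_temp: float = 24.0) -> float:
    """Lower the supply-air setpoint when the forecast says the room will run hot."""
    excess = predicted_temp(load_forecast) - target_temp
    base_setpoint = 18.0                              # illustrative baseline, deg C
    return base_setpoint - 0.5 * max(0.0, excess)     # simple proportional response

print(round(cooling_setpoint(load_forecast=0.9), 1))
```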

6. Power management

AI supports power management in data centres in a variety of ways. AI algorithms can allocate computing tasks based on power availability, while also making sure that power usage is balanced across the data centre. AI can also predict power usage and move non-critical tasks to off-peak times, in order to avoid excessive energy consumption and reduce costs. Data on energy consumption patterns can be collected and analysed by AI, providing insights into how power is being used and where efficiencies can be gained.

Energy performance across different parts of the data centre can be compared, identifying areas for improvement and helping set benchmarks for energy efficiency.
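
A toy example of the workload-shifting idea, with made-up tariffs and jobs, might look like this:

```python
# A toy power-aware scheduler: run critical jobs immediately and place deferrable
# jobs in the cheapest forecast hours. Tariffs and jobs are illustrative only.
hourly_price = {h: (0.30 if 8 <= h < 20 else 0.12) for h in range(24)}  # $/kWh forecast

jobs = [
    {"name": "vm_migration",  "kwh": 50,  "critical": True},
    {"name": "backup_dedup",  "kwh": 120, "critical": False},
    {"name": "index_rebuild", "kwh": 80,  "critical": False},
]

def schedule(jobs, prices):
    """Return (job, hour, estimated cost) with non-critical work pushed off-peak."""
    cheap_hours = sorted(prices, key=prices.get)          # cheapest hours first
    plan = [(j["name"], "now", j["kwh"] * max(prices.values()))
            for j in jobs if j["critical"]]               # critical work runs regardless of price
    deferrable = [j for j in jobs if not j["critical"]]
    for job, hour in zip(deferrable, cheap_hours):
        plan.append((job["name"], f"{hour:02d}:00", job["kwh"] * prices[hour]))
    return plan

for name, when, cost in schedule(jobs, hourly_price):
    print(f"{name}: run at {when}, est. cost ${cost:.2f}")
```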

5. Predictive maintenance

Predictive maintenance takes a preemptive approach to data centre operation, using AI to predict which equipment needs repairing. According to Deloitte, predictive maintenance can increase enterprise productivity by 25%, reduce breakdowns by 70% and lower maintenance costs by 25% compared with reactive maintenance.

“AI needs an enormous amount of data to learn and evolve, so as data centres grow, more data will be available for AI to use,” said Flora Cavinato, Global Service Product Portfolio Director at Vertiv. “Predictive maintenance will naturally scale, which means that the more dense and varied data population there is, the better the data trending, pattern recognition, insight learning and predictions.”
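
A minimal sketch of the preemptive idea, using made-up vibration readings and a simple statistical threshold in place of a trained model, could look like this:

```python
# A minimal predictive-maintenance check: flag a component when recent sensor
# readings drift well outside their healthy baseline, so it can be serviced
# before it fails. Readings and the threshold are illustrative assumptions.
import numpy as np

baseline = np.array([0.9, 1.0, 1.1, 0.95, 1.05, 1.0])   # healthy fan vibration, mm/s
recent   = np.array([1.2, 1.4, 1.6, 1.9])               # latest readings from the same fan

mean, std = baseline.mean(), baseline.std()

def maintenance_needed(readings: np.ndarray, z_threshold: float = 3.0) -> bool:
    """True if any recent reading is far outside the healthy baseline."""
    z_scores = np.abs((readings - mean) / std)
    return bool((z_scores > z_threshold).any())

print(maintenance_needed(recent))   # True -> schedule a repair before a breakdown
```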

4. Network automation

AI supports network automation in data centres by enhancing efficiency, reliability and performance in a range of ways. AI can automatically configure network settings based on real-time traffic, which ensures optimal network performance and reduces manual intervention. AI-driven systems can also enforce network policies across the data centre, updating configurations automatically to comply with security requirements.

AI also allows network resources to scale automatically as workloads change. This ensures the network can handle different levels of traffic without manual intervention.
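
As a simple illustration of traffic-driven scaling (the capacities, headroom factor and traffic samples below are assumptions for the example):

```python
# A minimal traffic-driven scaling sketch: size a pool of network resources
# (e.g. load-balancer instances) from measured throughput plus headroom.
# Capacities, headroom and the traffic samples are illustrative assumptions.
import math

def desired_instances(gbps_now: float, gbps_per_instance: float = 10.0,
                      headroom: float = 1.3, min_instances: int = 2) -> int:
    """Instances needed for current traffic plus headroom, never below a floor."""
    return max(min_instances, math.ceil(gbps_now * headroom / gbps_per_instance))

current = 4
for gbps in [25, 60, 140, 35]:               # simulated traffic samples
    target = desired_instances(gbps)
    action = "scale up" if target > current else "scale down" if target < current else "hold"
    print(f"traffic {gbps} Gbps -> {target} instances ({action})")
    current = target
```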

3. Resource optimisation

Hyperscale data centres are built by companies with vast data processing and storage needs. These firms may derive their income directly from the applications or websites the equipment supports, or sell technology management services to third parties. As they are significantly larger than enterprise data centres, they are an attractive choice for data centre operators.

Many big hyperscalers are exploring new technologies, such as AI, to further improve their data centre efficiency. For example, Meta is exploring how AI and machine learning can help it to optimise its data centre operations, especially surrounding the construction of a new data centre campus in Singapore, which will be one of the largest in the region.

2. Security

Security is a big factor in every industry, from the factory floor to the CEO’s office. The use of AI in data centres strengthens security in several ways:

AI can monitor network traffic and user behaviour to spot anomalies which may suggest a security threat (a simple sketch of this appears after this list).

AI-powered intrusion detection and prevention systems can identify threats effectively, by adapting to new data on emerging threats.

Following a security incident, AI can automate a response, reducing the time taken to contain threats. This can help to isolate affected systems and block malicious IP addresses.

Within the data centre infrastructure, AI can help to identify vulnerabilities. Machine learning models can suggest which vulnerabilities are most likely to be targeted, allowing for immediate action.

Through the automation of security tasks, AI minimises the need for manual intervention, lowers operational costs and allows security teams to put their energy into strategic activities.
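
For the anomaly-detection point above, here is a minimal sketch using an unsupervised IsolationForest; this is one common technique rather than any specific product's method, and the traffic features are made up for illustration.

```python
# A minimal anomaly-detection sketch for network traffic using an unsupervised
# IsolationForest - one common technique, not any specific product's method.
# The traffic features below are made up for illustration.
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row: [requests/sec, bytes/sec, distinct destination IPs] for one host.
normal_traffic = np.array([
    [120, 5.0e5, 12],
    [135, 5.5e5, 14],
    [110, 4.8e5, 11],
    [128, 5.2e5, 13],
])

model = IsolationForest(contamination=0.1, random_state=0).fit(normal_traffic)

new_samples = np.array([
    [125, 5.1e5, 12],     # looks like business as usual
    [900, 9.0e6, 450],    # sudden fan-out: possible scanning or exfiltration
])

# predict() returns 1 for inliers and -1 for anomalies worth investigating.
print(model.predict(new_samples))
```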

1. Sustainability

Data centres that support AI face the challenge of much higher computational power demands, prompting changes in facility design, newer cooling technologies and greater innovation in carbon footprint reduction.

As the data centre sector is called upon to confront its sustainability challenges, Data Centre Magazine spoke with Tom Kingham, Design Lead Europe & Japan at CyrusOne. He explained how data centres are starting to evolve into renewable energy developers to confront the energy efficiency challenge, in addition to how facilities are looking to be more flexible - both digitally and environmentally.

According to Tom, data centres should continue to address the unique demands of AI workloads, including the need for ultra-high-density power usage, advanced cooling technologies and scalable designs.

“This has been an ongoing priority for CyrusOne, guiding the design of current and future projects to effectively scale their AI infrastructure,” he said. “Inevitably, we are seeing a stronger focus on sustainability, with innovations reducing physical space requirements and carbon footprints, reflecting a broader trend towards greener data centre operations.”


Also:


Data centers are sprouting up as a result of the AI boom, minting fortunes, sucking up energy, and changing rural America

It's less than a year since ChatGPT launched to the public, triggering a boom in artificial intelligence investments and forever changing our understanding of the technology.

And while the rise of AI has already changed our digital realities, it's beginning to impact our physical world, too.

The AI boom has supercharged a wave of spending on data centers. This building boom is sucking up billions of dollars, along with water, land, and energy.

"There's a well-publicized arms race happening in AI, and the major tech companies are expected to invest $1 trillion over the next five years in this area, mostly to data centers," Jonathan Gray, Blackstone's president and chief operating officer, said on an earnings call on July 20.

The boom is set to double or triple the amount of energy that data centers consume.

By 2030, data centers are expected to reach 35 gigawatts of power consumption annually, up from 17 gigawatts last year, according to McKinsey. A recent Cowen research report estimated that AI data centers could require more than five times the power of traditional facilities.

"It's staggering," said Marc Ganzi, CEO of DigitalBridge, an investment firm that owns and operates data centers, fiber networks, and other tech infrastructure. "Not sure how we do it."

Insider has been covering the data center boom at length. Here, we break down what's going on: 

AI is supercharging the data center boom

AI model training is especially energy intensive. It requires the use of graphics processing units. These GPUs are specialized chips that multitask better and work faster than central processing units, or CPUs, which run most traditional cloud services.

AI computer servers being installed in data centers are often equipped with multiple GPUs, usually supplied by Nvidia. Each GPU consumes up to about 400 watts of power, so one AI server can consume 2 kilowatts. A regular cloud server uses 300 to 500 watts, according to Shaolei Ren, a researcher at UC Riverside who has studied how modern AI models use resources.

For instance, the power consumption of Nvidia's new GH200 server cluster is about two to four times more than a regular cluster of the same physical size, he estimated.
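
A quick back-of-the-envelope check of those figures (the GPU count and non-GPU overhead below are assumptions for illustration; the per-GPU and regular-server wattages come from the article):

```python
# Back-of-the-envelope check of the figures above. The GPU count and the non-GPU
# overhead are assumptions for illustration; the per-GPU and regular-server
# wattages come from the article.
gpus_per_server = 4
watts_per_gpu = 400            # "up to about 400 watts" per GPU
other_components_watts = 400   # assumed CPUs, memory, fans, power-supply losses

ai_server_watts = gpus_per_server * watts_per_gpu + other_components_watts
regular_server_watts = 400     # midpoint of the 300-500 watt range cited above

print(ai_server_watts)                           # 2000 W, i.e. about 2 kW
print(ai_server_watts / regular_server_watts)    # roughly 5x a regular cloud server
```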

Tom Keane, who oversaw Microsoft's cloud data centers for about two decades, recently warned about this. Microsoft is the main backer of OpenAI, creator of ChatGPT and builder of the most powerful AI models around.

"In the case of a training data center, you make that as big as possible, you put as many computers in there as possible and you're running that data center at full utilization all of the time," Keane told Bernstein analyst Mark Moerdler in a recent interview. "That physical data center starts to become more resource intensive. You start to design that AI data center very differently."

Rural America has become a hotbed

On a rolling expanse of rural Ohio land, America's digital future is being sown.

Last year, a partnership between the real estate investors Lincoln Property Company and Harrison Street purchased 190 acres in New Albany, a small city about 20 miles outside of Columbus where the pair plan to begin construction on a 200-megawatt data center by the end of the year.

The project's neighbors include Google, Meta, Microsoft, and Amazon – all of whom have similar plans, or are already underway with major data center projects.

"Our regional message is if you're a major data center developer or customer, we want to talk to you," said Matt McCollister, an executive vice president at One Columbus, a business development group in the region.

The data center industry has long been clustered in a handful of well-established markets, primarily Northern Virginia, Dallas, Phoenix, Silicon Valley, and Chicago. But the emergence of places like New Albany shows how soaring demand and the sector's voracious appetite for energy are increasingly pushing data center developers and users throughout the country.

Artificial intelligence, which requires massive computing power and energy loads, is expected to further this migration – especially as utilities in the industry's core markets have struggled to keep pace with its growth.

Wall Street is betting big

To help it reach $1 trillion in assets, Blackstone has wagered big sums on apartment buildings, warehouses, student housing, and other commercial real estate assets that proved to be shrewd investments.

Now, as borrowing costs rise, property values sink, and a once-soaring real estate market has become perilous, the investment giant is turning to ChatGPT for answers — literally.

Blackstone has been talking up data centers with expectations that the industry will benefit from a boom in artificial intelligence and become a key new area of focus in its $585 billion real estate portfolio.

Two years ago Blackstone funds, including its largest single real estate fund, Blackstone Real Estate Income Trust, known as BREIT, took the data center landlord and developer QTS Realty Trust private in a $10 billion deal that has made QTS the centerpiece of its data center ambitions.

Blackstone executives have said QTS's value has since tripled and that they plan to radically scale it beyond that with $8.5 billion of new data center projects due in the next three years and a $50 billion pipeline of longer-term development.

These complexes are hoovering up huge amounts of energy

Just a few miles from the $5 billion second headquarters that Amazon is raising outside of Washington, DC, the tech giant is in the midst of a far larger, and less conspicuous, building boom.

The company is in the process of developing $87 billion worth of data centers, a push that has already made it the biggest player in the world's largest data center market in northern Virginia. The featureless, warehouse-like structures are easy to miss on the sides of highways or tucked unassumingly amid suburban neighborhoods.

Data centers, including Amazon's, play an increasingly central, but unseen role in modern life, housing the digital infrastructure that powers critical functions such as e-commerce, autonomous vehicles, video streaming, and, now, artificial intelligence.

There is a flipside, however, to their now ubiquitous presence in places like northern Virginia. The facilities consume quantities of power so vast that they have begun to tax entire energy grids and could exacerbate the climate crisis.

Amazon does not disclose how many data centers it occupies, where they are located, or how much electricity they consume. The company's data facilities are tied to its large cloud computing business, Amazon Web Services, which offers software, storage, and other services to legions of customers.  

Based on a review of permits that Insider obtained through a Freedom of Information Act request, Amazon operates, or is in the process of building or planning, 102 data centers in northern Virginia. Together, the facilities, when they are all up and running, will have emergency generators capable of producing more than 4.6 gigawatts of power. That's almost enough backup electrical capacity to light up all of New York City on an average day.

That's forcing some utilities to delay their shift away from fossil fuels

In Phoenix and the surrounding region, data centers have attracted attention for the noise they blare, the water they guzzle, and the large tracts of land they've consumed.

Now, the prodigious power burned by this fast-growing industry threatens to overwhelm the city's utilities and stymie efforts to remove fossil fuels from the grid even as the climate crisis has flashed increasingly dire signals. This year, the daily high temperature in Phoenix reached or surpassed 110 degrees Fahrenheit for a record 54 days.

"We have about 7,000 megawatts of data center requests currently in our pipeline," Karla Moran, an executive at Salt River Project, one of two major utilities that serve the Phoenix region, told Insider. Moran noted those requests rivaled the size of the utility's entire 11,000 megawatt system.

While not all of those power requests are expected to materialize into actual development, she said the size of the interest was unprecedented. The data center industry's soaring electrical demand has been so significant, it factored into the utility's far-reaching plans, Moran said.

In October, the power company, better known as SRP, approved a significant expansion of its generating capabilities that includes the development of 2,000 megawatts of new methane gas facilities. Those plants will effectively preserve the size of its portfolio of fossil-fuel-fired infrastructure into the next decade and potentially beyond.

Moran said data centers, with their heavy, around-the-clock power use, were "one of the main reasons we look at having a resource like that."