Managing generative AI risks has become paramount for organizations across sectors. As Deloitte's recent insights note, these risks are already manifesting in the marketplace, affecting businesses in ways that require careful navigation.
One of the foremost concerns organizations face is regulatory uncertainty. According to Deloitte’s fourth quarter report on the State of Generative AI in the Enterprise, compliance with regulations has emerged as the top concern for surveyed organizations. Regulatory frameworks can significantly influence how companies utilize data and safeguard privacy. The U.S. government has taken steps in this area, such as the 2023 Executive Order 14110 aimed at ensuring the safe and trustworthy development of AI technologies. However, as policy shifts—like the rescinding of previous executive orders—occur within the government, organizations must stay vigilant and adaptable to the evolving legal landscape.
In tandem with regulatory risks, the technology infrastructure required to support generative AI is under significant strain. The demand for computing power essential for generative AI applications is skyrocketing, putting additional pressure on an already constrained electric grid. Deloitte’s 2025 Power and Utilities Industry survey highlights that data centers—which currently consume between 6% and 8% of total electricity generation in the U.S.—are projected to increase their electricity demand to between 11% and 15% by 2030. This escalating demand poses a challenge for aging infrastructure, which is struggling to keep pace.
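Taken at face value, those survey figures imply a steep climb in data centers' share of U.S. electricity generation. A rough back-of-the-envelope sketch (which assumes, purely for illustration, that total generation stays flat, so it understates absolute demand growth) shows the implied pace:

```python
# Illustrative math on the survey figures cited above: data centers'
# share of U.S. electricity generation rising from 6-8% today to
# 11-15% by 2030. The flat-generation assumption is hypothetical.

def implied_growth(share_now: float, share_2030: float, years: int = 6) -> float:
    """Annualized growth in share needed to hit the 2030 figure,
    assuming (hypothetically) flat total generation."""
    return (share_2030 / share_now) ** (1 / years) - 1

low = implied_growth(0.08, 0.11)   # conservative: high end now, low end 2030
high = implied_growth(0.06, 0.15)  # aggressive: low end now, high end 2030

print(f"Implied annual growth in share: {low:.1%} to {high:.1%}")
# → Implied annual growth in share: 5.5% to 16.5%
```

Even the conservative case implies years of compounding demand growth landing on infrastructure that was not sized for it.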
The complexities do not end there. Data centers are now grappling with new uncertainties across their value chains. Many industry players, including investment firms and tech infrastructure providers, are attempting to navigate the challenges posed by limited electricity supply. Regions like Northern Virginia and parts of Europe face hurdles in the construction of new data centers due to energy constraints. This situation has led to both risks and opportunities as organizations strive to secure necessary permits and funding for expanding their operations.
Another significant risk concerns application flexibility. Organizations often find themselves locked into vendor agreements that stifle adaptability. In the race to acquire advanced hardware, businesses risk overspending on technology that is quickly superseded, as hardware innovations continue to emerge at a rapid pace. With suppliers like NVIDIA facing surges in demand, ensuring access to the latest capabilities has never been more critical. Organizations that limit themselves to a single vendor may miss out on technological advancements that could provide significant competitive advantages.
Organizations are also voicing concerns about achieving value from their investments in generative AI. High upfront costs associated with training and maintaining large models have led one-third of survey respondents to express skepticism about the future marketplace adoption of generative AI. This hesitation underscores the importance of demonstrating ROI to facilitate broader acceptance of these technologies.
To address these marketplace risks, leaders are exploring multiple strategies. Some companies are opting to reduce computational demands by implementing smaller models that lessen the load on electrical resources. While training small language models (SLMs) can entail a higher upfront investment, organizations like Salesforce demonstrate how focused deployments can minimize long-term costs associated with electricity and specialized hardware.
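The tradeoff described above, higher upfront investment in exchange for lower ongoing electricity and hardware costs, can be framed as a break-even calculation. The sketch below uses entirely hypothetical dollar figures (not Salesforce or any vendor's pricing) just to show the shape of the decision:

```python
# Hypothetical break-even sketch for training a small language model
# (SLM) versus relying on a larger, costlier-per-query model.
# All dollar figures are illustrative assumptions.

def break_even_queries(upfront_cost: float,
                       cost_per_query_large: float,
                       cost_per_query_slm: float) -> float:
    """Query volume at which the SLM's upfront cost is recovered
    through its lower per-query (energy + hardware) cost."""
    savings_per_query = cost_per_query_large - cost_per_query_slm
    return upfront_cost / savings_per_query

n = break_even_queries(upfront_cost=500_000,
                       cost_per_query_large=0.01,
                       cost_per_query_slm=0.002)
print(f"SLM pays for itself after ~{n:,.0f} queries")
# → SLM pays for itself after ~62,500,000 queries
```

The real decision also hinges on model quality for the target task, but the basic logic is the same: focused deployments amortize their upfront cost over high query volumes.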
In alignment with this approach, organizations are making strategic infrastructure investments. Many are revisiting their data center strategies to incorporate on-premises solutions alongside cloud services to enhance resilience. However, this dual infrastructure can create its own challenges, particularly when managing energy consumption and cooling requirements. Consequently, hybrid models are emerging as a preferred approach, enabling businesses to balance workloads effectively between on-premises and cloud-based solutions.
Energy consumption remains a pressing concern, particularly as organizations navigate the limitations of public power grids. To mitigate risk, many are now tapping into third-party colocation data centers for their generative AI infrastructure. This shift allows businesses to lean on established facilities while exploring alternative energy solutions, such as microgrids incorporating renewable sources like solar and wind.
Innovative solutions like liquid cooling systems are gaining traction to manage energy use efficiently within data centers, as cooling requirements can account for up to 40% of energy consumption. Meanwhile, organizations are also investigating advancements in edge computing and low-earth orbit data centers, which can alleviate reliance on traditional power grids and reduce latency in data processing.
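To see why cooling efficiency moves the needle so much, consider a simple sketch. The 40% cooling share comes from the figure cited above; the 50% reduction attributed to liquid cooling is a hypothetical assumption for illustration, not a measured vendor result:

```python
# Illustrative sketch of cooling's impact on total data center energy.
# Cooling share of 40% is from the figure cited above; the 50%
# reduction from liquid cooling is a hypothetical assumption.

def total_energy_after(cooling_share: float, cooling_reduction: float) -> float:
    """Fraction of baseline total energy remaining after cutting
    cooling energy by `cooling_reduction`."""
    it_share = 1.0 - cooling_share
    return it_share + cooling_share * (1.0 - cooling_reduction)

remaining = total_energy_after(cooling_share=0.40, cooling_reduction=0.50)
print(f"Total energy falls to {remaining:.0%} of baseline")
# → Total energy falls to 80% of baseline
```

Because cooling is such a large slice of the total, even partial efficiency gains translate into double-digit reductions in overall consumption.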
Building trust and governance frameworks around generative AI is another essential pillar of risk management. Companies are encouraged to establish comprehensive strategies that incorporate third-party risk management and clear operational guidelines to ensure that AI implementations are not only effective but also ethical and transparent.
Ultimately, as the market continues to embrace generative AI, organizations must stay agile and proactive in their approach to risk management. By anticipating regulatory shifts, investing wisely in infrastructure, and employing innovative solutions, businesses can not only mitigate the risks associated with generative AI but also seize the opportunities it presents. As the industry evolves, fostering a culture of awareness and adaptability will be vital in harnessing the potential of this transformative technology safely and effectively.