As the US Grid Hits Its Limits, EnCharge AI Offers Another Way Forward

The United States power grid is increasingly strained. The Wall Street Journal recently reported that tech giants can’t afford to wait for electricity, prompting some to move into the power business themselves to meet growing AI demand.

This raises a different question: instead of focusing solely on generating more power, what if the industry also prioritized using less of it?

EnCharge AI is exploring that approach with an analog in-memory computing architecture designed to reduce the energy cost of data movement in AI inference, one of the most resource-intensive aspects of running AI models.
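To make the data-movement point concrete, here is a minimal back-of-envelope sketch in Python. The per-operation energy constants are purely illustrative placeholders (not EnCharge's figures), and the model ignores activations, caching, and analog-specific effects; it only shows why performing multiply-accumulates where the weights are stored can shrink the dominant energy term.

```python
# Illustrative back-of-envelope sketch: why moving weights can dominate
# the energy of an inference step. The constants below are hypothetical
# placeholders, not measured or vendor-published numbers.

E_MAC_PJ = 0.1          # assumed energy per multiply-accumulate (picojoules)
E_DRAM_BYTE_PJ = 100.0  # assumed energy per byte fetched from DRAM (picojoules)

def matvec_energy_pj(rows: int, cols: int, bytes_per_weight: int = 1,
                     in_memory: bool = False) -> float:
    """Rough energy estimate for one rows x cols matrix-vector multiply."""
    compute = rows * cols * E_MAC_PJ
    # Conventional architecture: every weight is moved from memory to the
    # compute unit. In-memory computing performs the MACs at the storage
    # location, so the per-weight movement term largely disappears.
    movement = 0.0 if in_memory else rows * cols * bytes_per_weight * E_DRAM_BYTE_PJ
    return compute + movement

# One hypothetical 4096 x 4096 layer, one token:
conventional = matvec_energy_pj(4096, 4096)
in_memory = matvec_energy_pj(4096, 4096, in_memory=True)
print(f"conventional: {conventional / 1e6:.1f} uJ")
print(f"in-memory:    {in_memory / 1e6:.1f} uJ")
```

Under these assumed constants, the movement term is roughly a thousand times the compute term, which is the gap in-memory architectures target.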

The company’s Chief Scientist, Shwetank Kumar, recently published a piece titled “The Efficiency Imperative: Energy Will Define AI’s Next Chapter,” discussing the role energy efficiency may play in the future of AI development.

The article looks at energy efficiency challenges across client devices, data centers, and physical AI applications, and outlines how EnCharge's technology aims to address them at each of these scales.

Read the full article here.