When Reality Hits

Sometimes the most important insights come from those "oh shit" moments that hit you when you're trying to disconnect from technology. Just ask Lionel Ringenbach, who had his epiphany while heading off to a 10-day tech-free retreat. He'd left an AI model training at home, and halfway down the highway, existential dread kicked in: what kind of energy bill was he driving away from?

That moment of panic led Lionel down a rabbit hole that would eventually shine a harsh light on one of AI's dirtiest secrets: we have no fucking clue how much power this revolution is actually consuming.

At VAI13, our monthly Vancouver AI meetup, Lionel dropped some truth bombs that had the room buzzing. Not with the usual techno-optimist hype that floods these spaces, but with a sobering reality check about the actual cost of our AI addiction.

Here's the thing about Lionel: he's not some doom-and-gloom prophet trying to kill the AI vibe. He's one of us, a builder, a maker, someone who's spent years pushing the boundaries of what's possible with technology. But he's also the rare voice willing to ask the uncomfortable questions the big players would rather ignore.

Let's talk numbers, because Lionel did something revolutionary: he actually ran the math instead of recycling the same vague statistics that float around Tech Twitter. Take ChatGPT, that digital oracle we're all getting cozy with. Sam Altman proudly announced they're handling a billion requests daily. Cool story. But Lionel wanted to know: what's powering all those clever responses?

Using his own GPU setup, he ran tests to measure the energy drawn by different types of queries. The results? Each ChatGPT interaction burns between 1.46 and 4.4 watt-hours. That doesn't sound like much until you multiply it by a billion. Suddenly you're looking at the equivalent of the annual electricity use of 140 to 1,560 U.S. homes, every single day.
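The multiplication behind that last figure is easy to sketch. Here's a minimal back-of-envelope in Python, taking Lionel's measured 1.46-4.4 Wh per query and the billion-requests-a-day figure as given; the unit conversions (and the idea of expressing the load as continuous megawatts) are mine, not from the talk:

```python
def daily_energy_gwh(wh_per_query: float, queries_per_day: float = 1e9) -> float:
    """Total daily energy in gigawatt-hours for a given per-query cost."""
    return wh_per_query * queries_per_day / 1e9  # Wh -> GWh

def average_power_mw(wh_per_query: float, queries_per_day: float = 1e9) -> float:
    """The same load expressed as a continuous power draw in megawatts."""
    return daily_energy_gwh(wh_per_query, queries_per_day) * 1000 / 24  # GWh/day -> MW

for wh in (1.46, 4.4):  # low and high ends of the measured range
    print(f"{wh} Wh/query -> {daily_energy_gwh(wh):.2f} GWh/day, "
          f"~{average_power_mw(wh):.0f} MW continuous")
```

Even at the low end, that's tens of megawatts of continuous draw devoted to nothing but chat completions.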
Here's a visualization that hit home: every time you're waiting for ChatGPT to respond, imagine 4 to 8 microwaves running at full blast. That's not some abstract cloud computation; that's real power being consumed in real time, generating real heat, requiring real cooling, creating real environmental impact.

But here's where it gets murky. The big players (Microsoft, OpenAI, Meta, Google, Amazon, xAI) are all playing their cards close to their chests. Sure, they'll brag about their massive GPU clusters (looking at you, Elon, with your casual mention of 200,000 GPUs), but actual power consumption? Radio silence.

We're talking about data centers running thousands of GPUs, each pulling 700 W to 1,000 W, 24/7. The latest NVIDIA beasts are power-hungry monsters, and they're being deployed by the thousands. Conservative estimates put the total number of AI-dedicated GPUs globally between 1 and 3 million. That's a lot of microwaves, folks.

The real kicker? Data centers already account for 1-1.5% of global electricity use and about 1% of global CO2 emissions. Some studies suggest AI might be responsible for 2.1-3.9% of total CO2 emissions. But we don't know for sure because, as Lionel pointed out, we're essentially flying blind.

Drawing on his experience in cloud computing, Lionel confirmed what many of us suspected: even inside these companies, energy monitoring is an afterthought. They track every conceivable performance metric except the one that might matter most for our planet's future.

But Lionel isn't just dropping problems in our laps; he's pushing for solutions. He's advocating for tools like CodeCarbon, a free, open-source way to monitor AI energy usage. He's calling for transparency from cloud providers, pointing out that if hardware manufacturers like NVIDIA and Intel can provide clear power metrics, why can't AWS and Google Cloud?

The impact of Lionel's research on our Vancouver AI community has been profound.
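Measuring this yourself doesn't require anything exotic. One common approach, which tools like CodeCarbon automate, is to poll the GPU's reported power draw while a query runs (for NVIDIA cards, `nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits`) and integrate the samples into watt-hours. A minimal sketch of the integration step; the trace below is made up for illustration, not a real measurement:

```python
def samples_to_wh(samples: list[tuple[float, float]]) -> float:
    """Integrate (timestamp_seconds, watts) power samples into watt-hours
    using the trapezoid rule. Real samples would come from polling the GPU."""
    wh = 0.0
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        wh += (p0 + p1) / 2 * (t1 - t0) / 3600  # average W over the interval, s -> h
    return wh

# Hypothetical trace: a GPU holding ~350 W for 30 seconds while generating.
trace = [(t, 350.0) for t in range(0, 31)]
print(f"{samples_to_wh(trace):.2f} Wh for this one response")
```

Subtract the card's idle draw over the same window and you have a per-query figure directly comparable to the numbers above.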
It's shifted the conversation from purely what AI can do to what it costs us to do it. It's forced us to confront the reality that every prompt, every model training run, every piece of generated content comes with an energy price tag we're not measuring.

As someone who's been riding these technological waves since the early days of the web, I've seen how easy it is to get caught up in the possibilities while ignoring the consequences. Lionel's work is a crucial reality check: not to stop innovation, but to make sure we're building sustainably.

There's something beautifully fitting about this research coming from Vancouver. We're sitting here in British Columbia, blessed with relatively clean hydroelectric power, while much of the world's AI runs on far dirtier energy sources. We have both the privilege and the responsibility to push for better practices.

As Lionel prepares to head to France, he leaves us with a challenge: start measuring, start monitoring, start caring about the energy impact of our AI experiments. Because if we don't get this right, all our clever algorithms and neural networks won't mean much on a planet we've cooked in the process.

This isn't about doom-scrolling our way into paralysis; it's about facing reality so we can build better solutions. The tools exist. The knowledge exists. We just need the will to use them.

As I watched Lionel break down these numbers at VAI13, I couldn't help but think: this is exactly the kind of clear-eyed analysis our community needs. No hype, no fear-mongering, just solid research and a call to action that cuts through the noise. Because at the end of the day, the future of AI isn't just about what we can make these systems do; it's about building them responsibly enough that we have a future to build them in.

Lionel, you'll be missed in Vancouver, but your research has lit a fire under our community that won't be easily extinguished.
Keep pushing for transparency, keep asking the hard questions, and keep making us confront the real costs of our digital dreams.


Introduction

Lionel Ringenbach, known in the artistic community as Ucodia, is a computational experimental artist and interactive designer who has been pushing the boundaries of human-machine creativity. Through his innovative work at the intersection of technology and art, Ringenbach creates unique, participatory experiences that emphasize playful interactions between humans and artificial intelligence.

The Creative Vision Behind CompoVision

At the heart of Ringenbach's recent innovations is CompoVision, a groundbreaking installation that redefines how we interact with AI systems. This project eliminates the need for traditional prompt engineering, instead creating a fully visual interaction system that makes AI accessible and engaging for everyone.

Technical Innovation

The technical architecture of CompoVision demonstrates Ringenbach's expertise in combining multiple technologies into a single interactive system.

Impact on the AI Art Community

Through his work with the Vancouver AI Meetup Community and various exhibitions, Ringenbach has demonstrated how AI can be made more accessible and engaging for the general public. His installations have not only showcased the technical possibilities of AI but have also sparked important discussions about the future of human-machine collaboration in creative spaces.

Looking Forward

As AI technology continues to evolve, Ringenbach's work stands as a testament to the possibilities that emerge when creativity meets technical innovation. His contributions to the field of computational art are helping to shape a future where technology and human creativity work in harmony to create new forms of artistic expression.