OpenAI CEO Sam Altman recently revealed on Twitter that polite phrases in user queries, such as "please" and "thank you," are costing the company tens of millions of dollars in operational expenses. Every extra word in a prompt adds computational load across ChatGPT's infrastructure, and at the platform's scale even these small courtesies translate into significant additional spending on compute.
In a lighthearted exchange on social media, one commenter suggested that the issue could be easily resolved by programming the system to automatically respond with "you're welcome." Another user humorously pointed out that if OpenAI truly aimed to conserve electricity, they might consider ceasing the practice of ending every response with a question.
The context of Altman's comments comes as ChatGPT experiences a surge in popularity. In recent weeks, the platform has been riding a wave of interest, particularly due to a trend involving AI-generated content styled after the beloved works of Studio Ghibli. This trend has attracted a significant number of new users, pushing the average weekly active user count to over 150 million for the first half of 2025.
According to a report from Goldman Sachs, the energy consumption associated with GPT-4 is striking. Each query requires about 2.9 watt-hours of electricity, nearly ten times the energy needed for a standard Google search. With OpenAI processing more than a billion queries daily, total consumption reaches approximately 2.9 million kilowatt-hours each day.
To put this into perspective, ChatGPT's daily energy usage is equivalent to the monthly electricity consumption of more than 10,000 average-sized apartments in Russia, which typically consume between 200 and 300 kilowatt-hours per month. This staggering figure highlights the significant footprint that AI technologies can have on energy resources.
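The figures above are simple to sanity-check. A minimal back-of-the-envelope sketch, using only the article's reported numbers (2.9 Wh per query, roughly a billion queries per day, and a 200–300 kWh/month apartment as the yardstick), not independent measurements:

```python
# Back-of-the-envelope check of the energy figures cited in the report.
# All inputs are the article's reported values, not measurements.

WH_PER_QUERY = 2.9                 # reported energy per ChatGPT query (Wh)
QUERIES_PER_DAY = 1_000_000_000    # "more than a billion queries daily"
APT_KWH_PER_MONTH = 250            # midpoint of the 200-300 kWh/month range

daily_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1000   # convert Wh to kWh
apartments = daily_kwh / APT_KWH_PER_MONTH          # apartment-months covered

print(f"{daily_kwh:,.0f} kWh per day")              # 2,900,000 kWh per day
print(f"~{apartments:,.0f} apartments' monthly usage")  # ~11,600 apartments
```

The result, roughly 11,600 apartment-months of electricity per day, is consistent with the "over 10,000 apartments" comparison in the text.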
As the demand for AI services continues to grow, the implications of such energy consumption are becoming increasingly critical. The balance between providing a responsive and engaging user experience while managing operational costs and environmental impact is a challenge that OpenAI and similar companies will need to navigate as they expand their platforms.
Altman's revelations raise important questions about the sustainability of AI technologies. As user engagement increases, so does the pressure on companies like OpenAI to pursue more efficient operational strategies, whether by developing algorithms that consume less energy per query or by optimizing server usage to handle the growing load more effectively.
Ultimately, the conversation surrounding the costs of AI interactions is not just about dollars and cents; it's also about the broader implications for the environment and the future of technology. As we embrace these advancements, it becomes essential to consider how we can innovate responsibly.
In summary, while the rise in operational costs associated with ChatGPT is noteworthy, it also serves as a reminder of the intricate relationship between technology, user behavior, and environmental stewardship. OpenAI's ongoing challenge will be to balance these aspects as they continue to lead in the AI space.