Headlines This Week
- Some of the biggest names in tech met with Chuck Schumer in Washington D.C. this week for a closed-door summit designed to inform future AI policy. The guest list included Elon Musk, Mark Zuckerberg, Bill Gates, and other billionaires who stand to benefit from a lax regulatory environment.
- Coca-Cola has a new flavor that was created by AI, and I’m genuinely curious what it tastes like. I bet it sucks.
- Brands are increasingly forgoing human models and opting for AI-generated “models.” Maybe it’s time for brand ambassadors to unionize?
- Last but not least: Insider writer and tech blogger Ed Zitron wrote an op-ed suggesting that AI could be used to automate the role of the corporate CEO. We talked with him for this week’s interview.
The Top Story: AI’s Water-Guzzling Habit

It’s no secret that the tech industry has a water problem. Data centers, which are integral to our highly digitized world, have to be cooled regularly to run properly. Problematically, those cooling processes require immense amounts of fresh water, much of which has to be drawn from local U.S. water systems. It probably comes as no surprise that the growing AI industry, vastly energy-intensive as it is, is one of the thirstiest in Silicon Valley.
That thirst was confirmed this week when Microsoft released its latest environmental report, which showed that its water usage skyrocketed between 2021 and 2022. The report, which covers the period when the company’s AI operations began to accelerate, showed that Microsoft burned through some 6,399,415 cubic meters of water in a 12-month period, roughly a 30 percent increase over the previous year’s rate.
The findings aren’t exactly surprising. A study published earlier this year by the University of California, Riverside estimated that it takes as much as half a liter of water, roughly a bottle’s worth, just to have a short conversation with ChatGPT. Worse, the study also projected how much water Microsoft had used to train GPT-3 over a two-week period: roughly 700,000 liters. The study noted the “extremely concerning” nature of these findings, given that “freshwater scarcity has become one of the most pressing challenges” of our time.
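For a rough sense of scale, here is a minimal back-of-the-envelope sketch in Python that combines the numbers cited above. The per-conversation and training estimates come from the Riverside study and the annual figure from Microsoft’s report; the comparisons themselves are purely illustrative and not something either source computes.

```python
# Back-of-the-envelope arithmetic using the figures cited above.
# 0.5 L per short ChatGPT conversation and 700,000 L for GPT-3 training
# are the UC Riverside study's estimates; 6,399,415 cubic meters is
# Microsoft's reported annual usage (1 cubic meter = 1,000 liters).

LITERS_PER_CONVERSATION = 0.5        # study's upper estimate for a short chat
GPT3_TRAINING_LITERS = 700_000       # study's two-week training projection
MICROSOFT_ANNUAL_M3 = 6_399_415      # from Microsoft's environmental report

microsoft_annual_liters = MICROSOFT_ANNUAL_M3 * 1_000

# How many short conversations would use as much water as training GPT-3?
conversations_per_training_run = GPT3_TRAINING_LITERS / LITERS_PER_CONVERSATION

# What share of Microsoft's reported annual usage is the training estimate?
training_share_of_annual = GPT3_TRAINING_LITERS / microsoft_annual_liters

print(f"{conversations_per_training_run:,.0f} short conversations ~ one GPT-3 training run")
print(f"Training estimate is {training_share_of_annual:.4%} of Microsoft's reported year")
```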
One of the study’s authors, Shaolei Ren, told Gizmodo this week that AI is far more energy-intensive than most other forms of computing. “The power density of AI servers is typically higher than other types of servers because they have multiple GPUs and, for each server, they can consume as much as two to three kilowatts of power, whereas normal servers typically consume under 500 watts. So there’s a big difference in terms of their power density, which means that there’s also a difference in their cooling needs,” said Ren.
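To put that power gap in rough perspective, here is a small illustrative sketch. The per-server wattages come straight from Ren’s quote, but the water-usage-effectiveness value is a hypothetical assumption added for the example, since, as Ren notes below, vendors rarely publish that kind of data.

```python
# Illustrative cooling comparison based on Ren's power figures above.
# ASSUMED_WUE_L_PER_KWH is a hypothetical water-usage-effectiveness value,
# not a figure from the study or from Microsoft; real facilities vary widely.

AI_SERVER_KW = 2.5           # midpoint of the "two to three kilowatts" Ren cites
STANDARD_SERVER_KW = 0.5     # "under 500 watts" for a normal server
HOURS_PER_DAY = 24
ASSUMED_WUE_L_PER_KWH = 1.8  # hypothetical liters of cooling water per kWh of IT load

def daily_cooling_water_liters(power_kw: float) -> float:
    """Estimate liters of cooling water per server per day under the assumed WUE."""
    return power_kw * HOURS_PER_DAY * ASSUMED_WUE_L_PER_KWH

print(f"AI server: ~{daily_cooling_water_liters(AI_SERVER_KW):.0f} L/day")
print(f"Standard server: ~{daily_cooling_water_liters(STANDARD_SERVER_KW):.0f} L/day")
print(f"Power ratio: ~{AI_SERVER_KW / STANDARD_SERVER_KW:.0f}x")
```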
There are steps that tech companies can take to reduce the amount of water they’re using to train these models, said Ren. Unfortunately, further oversight of whether companies are actually doing this is difficult, since most AI vendors don’t release the relevant data publicly, he said.
The Interview: Ed Zitron, on How to Automate Your C-Suite

This week we had the pleasure of speaking with Ed Zitron. In addition to being the founder of his own media relations firm, Zitron writes a tech-focused Substack (“Where’s Your Ed At”) and is also a contributing writer for Insider. This week, Zitron wrote an op-ed humorously suggesting that companies should replace their CEOs with AI. Executives didn’t love it. We spoke with Zitron about AI, labor, and the current foibles of corporate governance. This interview has been edited for brevity and clarity.
For people who haven’t read your op-ed, they should obviously just do that. But I wanted to give you a chance to make your case. So, just briefly, what argument are you making in this piece? And why should we replace corporate executives with ChatGPT?
The argument I’m largely making is that the CEO has become an extremely vague role. It’s become one with very little accountability, very little in the sense of a definitive set of responsibilities. If you look at the basic literature around the CEO role, it’s actually not that obvious what they do. There was a Harvard study from 2018 where they looked into what they were doing and it was like “people,” “meetings,” “strategy.” That could mean anything, quite literally anything! “Strategy”? What does that mean? So, CEOs appear to just be going into meetings and saying, ‘We should do this’ or ‘we shouldn’t do that.’ The problem is that if your only role in an organization is to take information and go ‘eh, we should do this’ and you’re not a lawyer or a doctor or someone with a real, actual skill set, what’s the goddamn point?
What kind of responses have you gotten to your piece so far?
Everyone on Twitter seemed pleased with it, whereas people on LinkedIn were split 50-50. If you say anything negative about executives on LinkedIn, a lot of guys who aren’t executives get very pissed off. (And it’s always guys, by the way; men seem really sensitive about this subject.) But there’s still a good number of people who think, yeah, if there’s a chief executive who has a vague role where they don’t actually execute, where they do stuff that isn’t actually connected to the product but they still get paid a ridiculous amount of money, maybe we do need to automate them! Or maybe we need to more clearly define their role and hold them accountable for that role and fire them if they perform poorly.
What do you think the chances are that companies will take you up on your suggestions here?
Oh, extremely low. Just to be abundantly clear, I don’t think a single goddamn company will do this. That’s why I offer an alternative in the piece, which is that we need working CEOs. Me, personally, I do a lot of the legwork at my own business. I would say I do more than my fair share. But, also, why would you work for me if I didn’t? That’s what I’ve never understood about these CEOs that don’t work. It’s like, I can understand an editor that doesn’t write, but an editor that’s never written or never writes? An editor who just sits there and makes calls? Or an executive editor? Or, I don’t know, some kind of private equity guy who buys a large organization but doesn’t seem to have any appreciation for what goes on there, and then proceeds to make a bunch of really stupid calls…that’s where you run into problems.
That’s what my Insider piece was about, basically. Executives seem disconnected from the work product. It’s a fundamental issue.
I’m curious what you make of generative AI and the way the executive class seems to be weaponizing it against workers?
Generative AI is hilarious because it has the appearance of intelligence without actually having any. It’s the perfect kind of McKinsey-level consultant; it just regurgitates content based on a certain subset of data. It doesn’t bring life experience to what it does. It doesn’t create anything new. It’s not learning or thinking. It’s basically just taking a huge box of Legos and trying to build something, using no actual creativity, with a rough approximation of what it thinks a house looks like.
There’s a lot of mystification around AI and there’s all this rhetoric about how it’s going to “change the world.” But really, when you get right down to it, AI is basically being pitched to companies as a cost-saver, because it offers them the opportunity to automate a certain percentage of their workforce.
This relates back to what we were talking about earlier. When you have executives and managers who are disconnected from the means of production, or the process of production, they’ll make calls based solely on cost, output, and speed, because they don’t actually understand the production process. They don’t know what’s happening inside the machine. The only things they see are what goes into the pipeline and what comes out the end, and they pay attention to how fast it’s happening.
Catch up on all of Gizmodo’s AI news here, or see all the latest news here. For daily updates, subscribe to the free Gizmodo newsletter.