Content provided by Asim Hussain and Green Software Foundation. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and delivered directly by Asim Hussain and Green Software Foundation or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://no.player.fm/legal.

We Answer Your Questions!

56:04
 
On this episode of Environment Variables, host Chris Adams is joined by Asim Hussain as they dive into a mailbag session, bringing you the most burning unanswered questions from the recent live virtual event on World Environment Day, hosted by the Green Software Foundation on June 5, 2023. Asim and Chris tackle your questions on the environmental impact of AI computation, the challenges of location shifting, the importance of low-carbon modes, and how to shift the tech mindset away from "more is more" (Jevons Paradox). Chock-full of stories about projects implementing green software practices, and valuable resources, listen now to have your thirst for curiosity quenched!
Learn more about our people:

Find out more about the GSF:

Questions in the show:
  • What computation is needed for AI, given its explosive use, and what impact will that have on the environment? [7:17]
  • Regarding location shifting: is the foundation concerned that when everyone time shifts to the same location or the same greener grids, that can increase the demand on those grids' energy, which could increase fossil fuel burning to meet said new demand? [18:50]
  • Why not just run low-carbon mode all the time, not just when the carbon intensity is high on “dirty electricity”? [34:35]
  • Given Jevons' Paradox, how do we change the thought pattern that more is more in tech? [38:15]
  • Are there any notable examples of organizations or projects that have successfully implemented green software practices? What can we learn from them? [49:00]

Resources:

If you enjoyed this episode then please either:


TRANSCRIPT BELOW:
Asim Hussain: We're talking about a cultural change that will take generations, is what I think it would really take. I don't think this is gonna happen in our lifetimes. I think the world that you've described is a beautiful world. I hope, my dream is, it will exist, and I think it'll only exist if the culture changes, and the culture changes worldwide and dramatically.
Chris Adams: Hello, and welcome to Environment Variables brought to you by the Green Software Foundation. In each episode, we discuss the latest news and events surrounding green software. On our show you can expect candid conversations with top experts in their field who have a passion for how to reduce the greenhouse gas emissions of software. I'm your host, Chris Adams.
Welcome to a special mailbag episode of Environment Variables. We're thrilled to bring you the most anticipated questions that arose during a recent live virtual event hosted by the Green Software Foundation on World Environment Day, the 5th of June. With over 200 passionate practitioners participating from all around the world, the event featured an expert panel consisting of influential voices in the green software movement.
Our panelists at the time included Asim Hussain, who's here today. Hey, Asim.
Asim Hussain: Heya.
Chris Adams: We also had Anne Currie, the Greentech advocate at Container Solutions and Community Chair at the GSF. We also had Tamara Kneese, UX research leader and strategist and lead researcher of the State of Green Software Report, as well as Pindy Bhullar, the ESG Chief Technology Officer at the bank UBS, and a PhD in her own right. During the event, they introduced the Green Software Foundation and unveiled all kinds of insights from the recently published State of Green Software Report, which sparked all kinds of engaging discussions. But we didn't really have enough time to cover all the questions that were coming in from the people who were asking there.
So you can think of today as a bit of a roundup of some of the questions that seemed particularly interesting and felt like they probably needed a bit more time to delve into properly. So that's the plan for today. We're gonna look into some of the questions that we didn't have time to answer, or, well, I wasn't actually there, so I didn't have a chance to answer, but that's what we're doing.
So today it's myself, Chris Adams, the policy chair at the Green Software Foundation, and I'm joined by Asim Hussain, the chair and executive director of the Green Software Foundation and lover and grower of mushrooms. And Asim, I'll leave some space for you if you wanna talk about anything in particular, cuz I have some interesting mushroomy factoids I'll share with you after this.
Asim Hussain: It's just the way you said "lover of mushrooms" was a little bit, um, risque. But yeah, I only love them cuz I like to eat them. But yeah, I am a mycologist. I grow mushrooms, and currently I'm actually very desperately trying to rescue two bags of lion's mane mushrooms, which I think I've let rot in their bags a bit too long.
So hopefully I'll have some lion's mane mushrooms this time next week.
Chris Adams: That sounds cool. Can I share my fact about mycelium, which I think is really cool? So I listened to a podcast called Catalyst with Shayle Kann, and he recently did an episode where he was talking to one of these researchers in South Africa who published some research about what I referred to as "micro-risal",
uh, the fungus. Basically it's the kind of fungus that you can think of as being attached to, or interfacing with, the roots of a tree. And they did some research, for the first time, to get an idea of how much CO2, how much carbon, is actually sequestered by this, cuz there is a kind of symbiotic relationship between this particular kind of fungi and the trees.
So basically the trees, they make sugars and stuff like that, whereas the fungi are really good at leaching out all these other kinds of nutrients, and it's basically a swap. So the mushrooms give the trees all the nutrients, and the trees give sugar to the mushrooms in return.
And as a result of this, there's a bunch of CO2 which is drawn into the trees and then fed to the mushrooms, and the figure that they calculated for the amount of extra carbon that's stored in the soil was something like 13 gigatons, which is a third of all of the carbon dioxide that's emitted by all of us burning fossil fuels.
This was so cool. I had no idea that shrooms were basically doing all that extra work under the ground, and it made me think of you, Asim. I thought, yeah, he would be proud of his little guys when he heard about this.
Asim Hussain: I am proud of my little guys. Yeah. No, I read the same thing. I didn't listen to the podcast, and I didn't dig into it that deeply, but yeah, I read a similar article recently, and it's mycorrhizal.
Chris Adams: Thank you. Yeah. Sorry.
Asim Hussain: Yeah. Myco- is the mushroom, or the fungal component; rhiza means root. It's basically a type of fungal relationship where it works symbiotically with roots. And yeah, there's a really good book called Wood Wide Web, which talks about this wonderful relationship between trees and just everything around you, actually. And they've even shown that these networks are really large, and they actually act as not just
Chris Adams: a network.
Asim Hussain: like a network, but also like a controlling network.
So you can have two competing trees, and if one's not surviving well, the mycorrhizal network will then force that negotiation for sugar, for carbon, in a way that benefits one of the trees, because a break in the canopy is really harmful for everything. So there's lots of really intelligent stuff like that happening, and um, yeah, it's really, really clever.
Chris Adams: That's kind of cool. I had this idea that they're a bit like a kind of shroomternet, but I didn't realize there was this extra, almost, what, diplomat role they're playing as well, to get rival entities to play nice with each other. Wow. That's cool.
Asim Hussain: Really cool. And for the listeners, there's a really interesting guy out there called Paul Stamets, and if you've not heard of him, just look him up online. He's been in this space for a long time now. He's an amateur mycologist who became famous, and he's been really pushing this world. He's got lots of great TED talks on why mushrooms are a kind of sustainability solution, and books written on it. And if you're actually a fan of Star Trek,
Chris Adams: I knew we were gonna
Asim Hussain: you're even gonna head there. So there's a new Star Trek series called Star Trek En... no, hang on, what's it called? It's just called Enterprise, actually. And the lead engineer is called Stamets, and it's actually based off of this Paul Stamets. And the engine they have is called the Spore Drive, and it can travel anywhere in the universe through a spore network. And I just thought it was amazing that mushrooms have integrated themselves even into my favorite sci-fi show, which I just love.
Chris Adams: Oh wow. Is it Enterprise or is it Discovery?
Asim Hussain: Discovery. Sorry. You're right. It's
Chris Adams: that was the
Asim Hussain: Yeah, yeah, yeah. Enterprise was that other one that didn't.
Chris Adams: Oh my word. We've gone full nerd already. Oh well, at least that's hopefully an entertaining segue before we dive into the mailbag. All right. Okay. Should we start with the mailbag then, Asim? See what the first question is.
All right. Okay. I watched the recorded World Environment Day panel that you had, and we'll share a link to that. So there will be some things that we're not gonna cover, because they've already been covered in that recording. Okay. So the first one touches on LLMs and things like that. So the question as I read it here was: what computation is needed for AI, given its explosive use, and what impact will that have on the environment?
That seems to be the question. Asim, I'll put this to you, and then we can take it in turns responding to that one, if that sounds good to you.
Asim Hussain: Yeah, sure. So I think obviously AI has come up quite a lot, even on this podcast, many times. There's been a couple of papers now. There's one original one, from when ChatGPT came out, which suggested that the energy consumption of a ChatGPT-style search was about five times more than a normal search. We'll make sure we quote the actual paper in the show notes. Am I getting that right? Does that sound familiar, Chris? Five times more than a normal one. Now I'm starting to wonder if that was even an understatement, because there's a further paper which talked about the water consumption from a chat.
And that one was interesting, cause it talked about the whole conversation: not one question to ChatGPT, but the series of questions you have to ask to get to an answer. That got to half a liter of water. And I'm not too sure if that original paper was talking about one individual question or the series of questions you've got to ask to get to your answer.
I'm not too sure. I think there's a significant amount of compute being used in LLMs, and there's not a lot of transparency on it right now.
Chris Adams: This is actually one of the problems that we do have, right? So one thing I might share is that you could do a kind of bottom-up estimate, based on looking at, say, what an Nvidia H100 is. Make an assumption about how many might be in a given data center and work out: okay, if an H100, Nvidia's graphics or ML-specific card, pulls this much power, then that times this number gives you some kind of figure.
And you do see particularly big figures; you see people throwing figures around in the hundreds of megawatts, or gigawatts, of power, particularly in America right now. But I've actually struggled to find specific figures on this.
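The bottom-up approach Chris describes can be sketched in a few lines. The overhead factor and card count below are illustrative assumptions, not measured figures; only the 700 W figure comes from Nvidia's published spec for the H100 SXM card:

```python
# Back-of-the-envelope estimate of facility power draw from GPU counts.
# OVERHEAD_FACTOR is an assumed PUE-style multiplier for cooling, networking, etc.

H100_TDP_WATTS = 700          # Nvidia's published max TDP for an H100 SXM card
OVERHEAD_FACTOR = 1.5         # assumption: facility overhead on top of the cards

def estimated_power_mw(gpu_count: int) -> float:
    """Rough facility power in megawatts for a given number of GPUs."""
    watts = gpu_count * H100_TDP_WATTS * OVERHEAD_FACTOR
    return watts / 1_000_000

# A hypothetical cluster of 20,000 cards:
print(estimated_power_mw(20_000))  # 21.0 MW under these assumptions
```

Plugging in different card counts quickly lands you in the hundreds-of-megawatts territory mentioned above, which is why grid connection becomes the bottleneck.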
It's also worth bearing in mind that right now, the thing that we could actually point to is some of the existing research from some of the cloud providers. There is a company called Vantage, I believe, and they do cloud billing analysis, and they recently shared some information about what proportion of spending is being allocated to GPUs versus CPUs and stuff like that.
And I'll share the link in a document for us. But the general argument is that yes, AI is large. I think the figures on the front page of their most recent report show that what you might qualify as AI, as a share of what people spend on typical cloud, like Amazon Elastic Compute Cloud, EC2, is between six and eight percent of all EC2 spending, but it's growing extremely quickly. That's the thing that we are seeing.
And I'll share a link to this, cuz I suspect there'll be a new version of this cloud cost report, which gives us some numbers about what people are spending.
But this is not the same as what kind of forward investments people will be making. And one of the problems that we see is that you're basically running up against the limits of the grid right now, because data centers are such dense users of electricity. You have scenarios where a data center full of these kinds of cards will use more power than a grid is able to actually deliver to it.
So your constraining factor right now is the capacity of the grid to feed into the data center, more than anything else. And that's one of the key problems that we're seeing come up again and again at the moment.
Asim Hussain: I hear all the arguments, and again, I think there's a real lack of data right now to really get a clear answer, but then you have to look at other kinds of economic and proxy arguments. And I think there's one factor which no one can ignore, which is that the amount of interest and investment heading into LLMs and AI in this space has outstripped anything that existed before. And that could not be true if this was not at least perceived as a massive growth opportunity for organizations, which I think would have a knock-on effect on emissions.
Whether that's right now, or whether everybody's seeing that this is the future and that's what they're putting effort into, which means this is gonna be a big growth area in the future as well. I think that aspect of this is just true and we can't ignore it: there's a lot of interest in this space. So I think that's the thing to think about as well.
Chris Adams: So there's one thing that might be worth bearing in mind, which I don't think is unrelated to the fact that in many places people are constrained on supply being able to meet this demand. You are seeing a real crop of new, smaller, much more efficient LLMs being created, specifically because there is an interest in not being dependent on a single provider, or in being able to run this on, say, your own hardware, for example.
So I'll share a link to a really nice post by Simon Willison, who's been talking about some of the most recent open models that are designed to basically run on a laptop that can in many cases give you results which are comparable, if not indistinguishable from some of the really expensive LLMs and expensive generative models that you see right now.
So there is a kind of shift here, and I actually dunno what direction you're gonna see, because like you said before, we don't have access to what kind of percentage AI is really making up right now in terms of future investment. We don't really know.
What I've done is share a link in the show notes to this chart from Data Center Knowledge. It basically gives an idea of projected growth by hyperscalers over the coming few years. And we can see figures of, say, Google at maybe 3,000 megawatts of assumed capacity right now, Microsoft at similar figures around the 2,000 megawatt mark, and the same elsewhere. And you see that there are projections to double that over the coming few years. And I dunno if these are before or after the decisions that people have been making here, because you've gotta remember that.
Asim Hussain: right.
Chris Adams: We've only just seen, in less than a year, the Nvidia H100 being released, and not just that: now AMD have come out with their equivalent, their competitor to this.
So you now see many more things available. So the question is, where are these gonna go, and how are they gonna be powered? The worst scenario would be that people end up basically setting up data centers and then finding non-grid ways of generating power to actually run these machines, like using diesel and stuff like that.
I'd really hope that doesn't happen, but I can imagine scenarios like that, or possibly reopening some of the existing generation that people had shut down after the crypto collapse. And this is one thing that Tamara mentioned: she said it's worth looking at the role that LLMs are playing in the public discourse.
They're very much filling the same role that the metaverse, or NFTs, were supposed to fill maybe a year ago, for example. It's worth saying that yes, we know this is being seen as a driver of demand. Whether there are actual numbers that are reliable right now is another matter, and we can point to various figures for demand and what people are looking at.
But at best, these are all currently trailing indicators, like the Vantage report, which shows these figures; that's only based on what's already been set up. And it doesn't really tell us whether people are stuck with buildings full of graphics cards that they can't plug in and sell AI from right now,
because they can't get this stuff connected to the grid, for example.
Asim Hussain: I just also think that we can't ignore the amount of money that is flowing into this space. And I hear the arguments for open source models, and my heart really wants them to win out, but when it costs like a hundred million dollars to train up something like GPT-3, which I think was the estimate,
and the real benefits come when you compute even more and more, I just don't know if the open source models will win out, because obviously people are spending money because there's an advantage to doing so. People wouldn't normally be able to peel off a billion dollars from a company for no reason.
So I think there's lots of data we don't have. But that's the data we do have, and it's telling us something at least: there's a lot of money in this.
Chris Adams: Yeah, I guess that's one thing we currently do not have. Again, it's really hard to get a decent number about where things are going with this, for example, because I think I'm in the camp where actually there is a lot of interesting stuff happening with open models, where people are basically defining where they're gonna compete and trying to come up with alternatives.
When you look at some of the purchases being made by companies which aren't just Microsoft and Facebook and Google, I think Databricks is a really good example in my view. They published a bunch of open data specifically to help build this competing ecosystem, and they recently purchased a company called Mosaic ML.
They published two openly licensed LLMs: one's called MPT-30B, and the other one's called MPT-7B. These are ones that you can run on your laptop right now. And they're large, like 19 gigabytes in size, and you need a relatively chunky laptop, but you are seeing this, and they are, if not comparable, well, you do see a bit of an arms race right now. And it's interesting where things will go, because we've linked before to that "we have no moat, and neither does OpenAI" memo that leaked from Google. But right now we're not quite sure. I don't know if it's a function of organizations just having loads of access to cheap money, or people seeing that this isn't actually defensible in the long term.
Because now you see all these open models coming out, it might be comparable. It might be the case that if you can just get to the workloads already, or get to the workflows, people are not gonna care that much, in the same way that loads of people end up using Microsoft Teams, not necessarily because it offers
Asim Hussain: about it recently.
Chris Adams: the best user experience for someone trying to join a video call, right? It's more the case that there's a workflow that people have, or there's a way to bundle some of this in. I wonder if that's where some of these network effects are actually more important than the training stuff, and that's where the levers are. So yes, AI is interesting and cool, but it's actually much more about market structure, antitrust, and stuff like that. That's probably gonna be more of the driver, perhaps.
Asim Hussain: Oh, interesting. I see. It's not how powerful your model is, it's how you can integrate it into your existing business models: how you can monetize it, or even if you don't monetize it, how you can use it to strategically win versus your competitors. And that might not necessarily have anything to do with the raw power of the model.
Chris Adams: We've gone a bit off the initial question, but yeah.
Asim Hussain: That's a risk you take when you ask this podcast an AI question.
Chris Adams: Yes, two people with possibly questionable access to information. Should we move on to the next question? The next ones might be on more solid footing for us. Okay. So the next question was one about time and space: location shifting and time shifting, when people are talking about the idea of carbon aware software.
And the question is, basically: regarding location shifting, is the Green Software Foundation concerned that when everyone time shifts to the same location, or to the same greener grids, that can increase the demand on those grids' energy, which could increase fossil fuel burning to meet said new demand?
Now, Asim, you might need to unpack this first, to explain what happens with the whole merit order, because if you're not familiar with how grids work, it's not immediately obvious why lots of people using computing in one part of the world would lead to more fossil fuels burning, other than that part of the grid partly running on fossil fuels, for example.
Asim Hussain: Let's say you had a frictionless capability to move your compute to anywhere in the world at any given moment in time, and everybody in the world had exactly the same capability of frictionlessly moving their compute. Wherever is greenest right now, it's probably gonna be France or somewhere in the Nordics.
And then what would happen is, let's say in the next hour it's France, that's the greenest in the world. Every single bit of compute in the entire world would just move to France. Those data centers, if they could theoretically handle that load, would then suck up all that green energy, which is what made those grids the greenest.
And those grids still need to make energy for the people in France to boil their kettles and do all the other things they need to do. And so all they can do at that moment is burn stuff, typically coal and gas. Those are chemical batteries; they're the things that you can spin up.
Gas especially is stuff you can spin up very quickly, and so effectively that just burns more fossil fuels. So that's basically what would theoretically, and this is very theoretical, happen: if everybody did just move all their compute over to, let's say, France, France would be forced at that moment in time to burn more coal and gas.
Have I explained that correctly? Have I missed something out, Chris, or?
Chris Adams: I think that's about right. The thing it might be worth us sharing in the show notes is an article I put together called Understanding Energy Trends at the Layer Below the Internet Stack, which talks about this. There is a really nerdy techno concept called the merit order,
with the idea being that different kinds of energy have different costs. Things like solar and wind, once you've installed them, because you're getting the fuel from the sun, you don't have to purchase extra sun to run them, so the running costs are almost free. That's very low, right? Now there are other things which are, again, expensive to install,
but once they're installed, the fuel is relatively cheap for the amount of power you get out; nuclear is a good example of this, where you can get loads out that way. Then you have other kinds of fuel, like coal and gas and so on. Broadly speaking, you have a trade-off: the higher the cost of the fuel, the more quickly things can respond to demand, but they're usually dirtier.
That's the idea behind this, and I think the argument being made here is that if everyone moved all their computing to one part of the world, we would induce all this extra demand, which could only be met by things responding to the extra load on the electricity grid, which would usually mean people spinning up really dirty gas-fired power stations, or extra coal, or stuff like that.
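The merit-order idea can be sketched with a toy dispatch model. The generator capacities and marginal costs below are invented for illustration; the point is just that once the cheap renewables and nuclear are fully used, the marginal megawatt of any extra demand comes from gas or coal:

```python
# Toy merit-order dispatch: generators are used in order of marginal cost,
# so extra demand is met by the cheapest *remaining* (often dirtier) source.
# Capacities (MW) and costs (per MWh) are invented for illustration.

GENERATORS = [
    ("wind",    5_000,   0),
    ("solar",   3_000,   0),
    ("nuclear", 4_000,  10),
    ("gas",     6_000,  60),
    ("coal",    4_000,  80),
]

def marginal_source(demand_mw: float) -> str:
    """Return the generator that serves the last megawatt of demand."""
    remaining = demand_mw
    for name, capacity, _cost in sorted(GENERATORS, key=lambda g: g[2]):
        remaining -= capacity
        if remaining <= 0:
            return name
    raise ValueError("demand exceeds total capacity")

print(marginal_source(9_000))   # nuclear: renewables plus nuclear cover this
print(marginal_source(13_000))  # gas: the extra load spills past the clean sources
```

Shifting a big slug of compute onto this grid is exactly the jump from 9,000 MW to 13,000 MW: the average mix barely changes, but the marginal source flips to gas.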
I think that's the argument, and that's essentially the question. Now that we've explained the premise, I should ask you: what is the kind of official response to this? Is that likely to happen? Is this a thing we should be aware of, and how do we respond to it?
Asim Hussain: My answer to this one is always: have you ever watched the TV series called The Wire?
Chris Adams: Yes.
Asim Hussain: So watch The Wire.
Chris Adams: I'm not sure where we're going, but go with this. All right? Yeah.
Asim Hussain: Go with this. It's about gangs and police, like in New York City. In one of the later seasons there's a gangster called, I think, Marlo Stanfield, and I remember in one of the episodes one of his workers is telling him something.
He turns around and goes, "that's one of them good problems." And that's what I think about this. Someone's telling me a problem and I'm like, this is a good problem to have. If we are ever even remotely getting to the point where demand shifting is affecting a grid, that is a level of achievement which is excellent.
Yes, okay, there are negative consequences to that approach, but we are not even remotely there right now. So worrying about that is, I think, a little bit too hyperbolic at the moment. "You shouldn't do something because, if you take it to the absolute extreme, it will be negative" is, I think, what I would say to this argument. I would say demand shifting is never going to be the one solution you have in your pocket to reduce the emissions of your application or your architecture. I always describe it as one of the things that you can do. It's one of the easier things to do. It gets you started on the much more challenging journey of energy efficiency and hardware efficiency: reducing the amount of energy you use, reducing the amount of compute you use.
But it gets you there. And I think that's why a lot of people have been interested in carbon aware computing. I always say it's not going to be the solution that solves climate change, it's nowhere near that, but it's a stepping stone on the journey there.
Chris Adams: I think I would've answered that slightly differently, actually. Because when I see this question being asked, it's essentially: will this demand cause people to do this? I think there's an assumption that people who are trying to move computing somewhere are doing it because they're looking for greener energy, right? Now, I think there is one way that you can solve this purely from an information point of view: if you are looking for the lowest carbon intensity, and you can see that the carbon intensity is increasing, then
Asim Hussain: I see. Yeah. Yeah.
Chris Adams: you would just choose to not run it there, right? So this is somewhat dependent on organizations having some of this information published and visible, but I think that's actually something that can be done.
Even in places where this is not public information right now. So for example, in the UK this information is visible on a really clear basis: there's a website called carbonintensity.org.uk, which publishes things under a very permissive license.
France already has its data available that you can pull out. So if you're gonna do this, then I think you would just look before you deploy something, or you would build some software to check whether you're gonna make it worse. That feels like one of the solutions here, and like something that will be made available to people in a number of different ways.
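A "look before you deploy" check of this kind might look like the sketch below. The region names, intensity figures, and threshold are made up for illustration; a real version would fetch live numbers from a source like the UK's carbonintensity.org.uk API:

```python
# Sketch: only schedule a deferrable job in the region with the lowest
# carbon intensity, and only if that intensity is under a chosen threshold;
# otherwise return None and wait for a greener hour.

def pick_region(intensities, threshold):
    """Return the lowest-carbon region under the threshold, or None to defer."""
    region, value = min(intensities.items(), key=lambda kv: kv[1])
    return region if value <= threshold else None

# A made-up snapshot of grid carbon intensities, in gCO2/kWh:
snapshot = {"uk": 180.0, "france": 45.0, "germany": 320.0}
print(pick_region(snapshot, threshold=100))  # france
print(pick_region(snapshot, threshold=20))   # None: every grid is too dirty right now
```

The threshold is what prevents the pile-on described in the question: if everyone's demand pushes the greenest grid's intensity up past the threshold, new jobs defer instead of following the crowd.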
Asim Hussain: You reminded me of a very interesting conversation I had with somebody from Google, cuz they created a service — the name started with a V, something — and you pumped your workloads into it. It forecast what the next day's grid carbon intensity was going to look like, I think day by day. And you didn't just ask where the lowest carbon intensity was and push everything there — if you actually need to run quite a lot of compute, you can't push it all into one place, so it let you be more intelligent about where you put it.
And the idea was thrown out there: what if this service existed for everyone? What if we all collaborated? What if we were all so open — and this is not unlikely to happen — that we said, do you know what, I'm going to submit to this online database that I need to run this much compute?
I need to run it tomorrow, I want to run it in the greenest region, and it schedules it for everybody. So you run in France, you run in Germany, you run in Norway, and we work it all out together. That kind of openness with the data, I think, would solve this problem as well — but I don't see any corporation being that open regarding their future workloads.
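In a toy form, the collaborative scheduler Asim imagines could look something like this greedy sketch — the region names, carbon intensities, and capacities are all made-up numbers, and a real shared market would need far more than greenest-first packing:

```python
# Toy sketch of a shared "green scheduler": everyone submits tomorrow's
# compute needs, and jobs are packed into the cleanest grids first, subject
# to each region's spare capacity. All figures are invented for illustration.

def schedule(jobs, regions):
    """jobs: {job_name: compute_units}; regions: {region: (gCO2_per_kwh, capacity)}.
    Returns {job_name: region}, assigning the biggest jobs to the greenest
    grid that still has room."""
    ordered = sorted(regions.items(), key=lambda item: item[1][0])  # cleanest first
    spare = {region: capacity for region, (_, capacity) in ordered}
    placement = {}
    for job, units in sorted(jobs.items(), key=lambda item: -item[1]):
        for region, _ in ordered:
            if spare[region] >= units:
                placement[job] = region
                spare[region] -= units
                break
    return placement
```

Running `schedule({"training": 50, "reports": 20, "backups": 10}, {"France": (55, 60), "Norway": (25, 30), "Germany": (350, 100)})` would put the big training job in France — Norway's grid is cleaner but too small for it — and the smaller jobs in Norway.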
Chris Adams: The thing that you just described there, Asim, was essentially how energy markets work, right?
Asim Hussain: Oh,
Chris Adams: And they're regulated markets where there is not one owner — rather than just having only the Amazon cloud or the Microsoft cloud, you have multiple players, right? So in order for that to be possible, you would need a kind of different structure, or you'd need people who played a part in making sure things can be dispatched to the correct actors — you'd need something like a feed-in tariff for compute, stuff like that. I feel this is actually quite a nice chance to draw people's attention to a really interesting proposal from Adrian Cockcroft. He's written a piece about why the whole idea of trying to schedule workloads can be counterproductive, which I'm not totally sure I agree with all of — I'm not sure we have actually seen people's shifting create that demand — but I understand the thrust of his argument.
But the real-time carbon footprint standard that he's proposing feels like it'll go a long way to providing the numbers people would need access to, to realize: am I really making things worse by doing this? But the other thing you could do is just price carbon into the cost of cloud, right?
We already have spot markets. If alongside them there was a "spot plus", which basically included the cost of carbon at, say, a hundred euros or a hundred dollars a ton, and you looked at that, then that would, in my view, be a fairly simple way to make sure you're not shifting all the compute to the worst places.
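Back-of-envelope, the "spot plus carbon" idea might work like this — the carbon price, spot prices, and grid intensities below are all invented example numbers:

```python
# Sketch of a "spot plus" price: the hourly spot price plus the cost of the
# CO2 emitted per kWh at an assumed carbon price. All numbers illustrative.

CARBON_PRICE_PER_TONNE = 100.0  # assumed: $100 per tonne of CO2

def carbon_adjusted_price(spot_per_kwh, grid_g_per_kwh):
    """Spot price per kWh plus the priced-in cost of that kWh's emissions."""
    tonnes_per_kwh = grid_g_per_kwh / 1_000_000  # grams -> tonnes
    return spot_per_kwh + tonnes_per_kwh * CARBON_PRICE_PER_TONNE

# A cheap coal-heavy region vs a slightly pricier hydro region:
dirty = carbon_adjusted_price(0.05, 800)  # 0.05 + 0.08  = 0.13
clean = carbon_adjusted_price(0.06, 30)   # 0.06 + 0.003 = 0.063
```

With carbon priced in, the hydro region wins on price even though its raw spot price is higher — which is exactly the incentive being described.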
For example, if you're gonna go totally neoliberal and price based, then yeah, that's how you can do this stuff. But the other thing you can do is go totally whacky and just have massive batteries in data centers, the way people look like they're doing for other significant draws of electricity.
So for example, if you look at electric car charging stations, lots of them now have lots and lots of onsite storage to deal with the fact that car charging is really spiky — most of the time a charger is not being used, but then people come to it and need to pull a huge amount of power very quickly.
Likewise high-end induction hobs, which have their own batteries inside them to, again, deal with this big spike in use. There's lots of strategies you can use which mean that you don't need to actually burn fossil fuels for this. It does involve nerding out about the grid, and that's one thing which is new to a lot of people working with this.
So to an extent, I don't see it in the same way that you might see it, Asim, cause I feel this is actually one that can be addressed using various techniques people are using in other sectors which aren't technology.
Asim Hussain: I will just say my view is probably clouded from having many conversations with people about carbon aware computing being shut down because of versions of this question — yes, carbon awareness is nice, but if it was taken to the extreme, it would destroy our whole infrastructure. I've had this question phrased as a response to me, as a reason for why we can't even entertain looking at carbon aware computing.
So that's where you've probably triggered my default defense mechanism against this question. But I think there's a very nuanced, very important topic here, and I think it's important for you and me to have different opinions, cause that's how we share all this important knowledge with the world.
Yeah.
Chris Adams: All right. Oh, there's one thing I should add a bit of a plug for. There's some really interesting work from, I think it's Abel Souza and Noman Bashir. They've been publishing some fantastic papers talking specifically about the likely impacts of carbon aware computing and what the savings could possibly be.
It is quite a technical paper, actually — when I read it, I was like, oh wow, there's a lot of numbers in these charts — but it's really good. I found it one of the most useful ones for informing my opinion about where this goes, and I'll make sure we share some links to it, cause I think it'll add some extra nuance to this conversation.
The other thing to bear in mind is that it's not like large hyperscale companies are not making enough money to buy batteries, right? If they're able to spend 70 billion on share buybacks in a given year, they can probably afford to buy literally hundreds of megawatts of extra capacity.
You could just have five or six hours of local battery storage so that you wouldn't even need to touch the grid — you'd just run it locally if you want to be sure that your power is green. But that's a separate discussion, so I'm just gonna park that once again, cuz I'm worried about sounding like a bit of a broken record on this one.
Asim Hussain: You do. I don't know where all this money is, cause it doesn't find its way into my budgets. But yeah.
Chris Adams: There was a report from the Rocky Mountain Institute, who were doing some analysis on green Bitcoin and things, and they'd said that with 115 billion US dollars you could buy up every single coal-fired power plant on Earth and replace it with renewables.
Asim Hussain: 115 billion?
Chris Adams: 115, yeah. Last year, the combined share buybacks — so the money that Apple, Google, Amazon, and Microsoft had so much of that they just thought, oh, I'm just gonna buy my own shares —
that was more than $125 billion. So single-handedly, that could solve it in a single year. But we've decided to spend it on — do you know the problem with climate change? Shareholders aren't getting enough money back. I feel that this is the thing we need to be talking about when we're talking about green software: where is this money going?
Cuz we clearly have the money for this. It's just a case of priorities, and we could be moving faster if we really wanted to. But I'll stop now, because I'm getting a little bit ranty. Sorry about that.
Asim Hussain: I want everybody to know that the entire time I worked for Microsoft, Chris Adams, whenever he was talking to me on chat, would always type M-dollar-sign — always M$ for Microsoft. But I dunno why you don't do one now I'm at Intel. I suppose there's no... could you do a euro sign in Intel? Maybe there's a reason why you don't put a euro sign in there for Intel. But anyway.
Chris Adams: I think it's cuz this was something from when I first came into technology and used to read The Register. They used to call IBM Big Blue, and there was another one, the Beast from somewhere. And yeah, Microsoft was M$, like
Asim Hussain: What was it? So
Chris Adams: because they made so much cash.
Yeah, it wasn't me being smart — it was totally second-hand, mate. Yeah. They are very effective at basically
Asim Hussain: very, they, yeah. They've
Chris Adams: shed-loads of money. Absolutely.
Asim Hussain: Yeah. That's what corporations are there for.
Chris Adams: Do you know what? In India — I didn't realize this, but in India, which is the most populous nation on Earth, every large company, as a condition of operating there, has to invest something like 20% of its profits, by law, in what India considers priority areas — so specifically into renewables, for instance, right?
So there are all these mechanisms that mean we can direct funding to places to speed up action on climate. And if we're talking about technology and carbon awareness and stuff like this, then we really need to be prepared to have conversations about how much in the way of resources we really want to allocate towards what the science is spelling out, and how much we need to make sure the share price goes up. Because yeah, okay, it's nice that people have pensions and things, but it would also be nice to have a livable world, and having this much money available... come on, let's get this sorted.
Literally one year would solve it. But that's another discussion. Anyway, let's move to the next question, because I think people are here to learn about code, not about economics.
Asim Hussain: We started off with opinions on AI, we turned into an energy podcast, and now we're talking about capitalism. So let's just do it — let's turn into a politics podcast.
Chris Adams: I guess it's everywhere. Let's move to, okay, next question.
Asim Hussain: next question.
Chris Adams: Okay. This question was about the idea of some machines and some software running in a low carbon mode, and it basically asked: why not just run low carbon mode all the time, not just when the carbon intensity is high on dirty electricity?
I think this might be a reference to things like Branch Magazine — or even, from your CarbonHack, one of the winning designs was a software kit that would show different versions of a site depending on how dirty the electricity was, to stay inside a carbon budget.
Asim over to you.
Asim Hussain: It's an interesting question. The way I've always imagined it — and in fact the way you can use Branch Magazine: you can just go on Branch Magazine and select low carbon mode all the time, sorry if I've got the terminology wrong. I've always imagined these kinds of UI modes as something you can select if you want to, or something that should be auto-selected based on how carbon intense the grid is. So it's user driven. And I suppose if you've got a product and you force yours into low carbon mode and a competitor doesn't, and all your users move over to the competitor, then you've gotta factor that in as well.
But I also think, Chris, is there an argument here that this stops money from going into renewables? I don't think it does. I think the money will still go to renewables.
Chris Adams: No, I don't think this question is really about finance.
Asim Hussain: Yeah, yeah. I don't think it was finance. Okay. Yeah.
Chris Adams: Okay, so the way we designed this in Branch Magazine, cuz we were playing around with this idea: we made it user definable, so readers could choose to override it, but we would set a default. It was as much an education piece as anything else, because a lot of the time people aren't even aware of where the power comes from — most of the time you don't even think about it. The idea that this was being foregrounded, that the materiality was being exposed to you, was the new idea for us. That was why we were doing it — to emphasize this, because we thought it was a nice way to hark back to some of the ideals of the web being something that's supposed to be open and accessible for everyone. So if you design a low carbon mode that emphasizes the fact that the grid changes, and at the same time emphasizes that when you're using something it should be accessible for people who may be partially sighted, or stuff like that —
then you can embed some of these other values in how you build things, to communicate a different kind of sensibility. So I think the general answer is that a lot of the time people do quite enjoy rich experiences, and having a sober or monkish experience all the time might not be particularly compelling for lots of people.
And I think it's okay to be explicit about some of that, right? I don't think it's realistic to think everyone only ever wants to see this stuff, or to make all those decisions for someone else — I think that might be a little bit too paternalistic. That was my take on it.
But there are things you could do to kind of hide this. You could design it so that when you build something, there are ways to provide a rich experience whilst reducing the resource impact — in the same way that you can refactor code whilst reducing the amount of computation it needs to do something.
And I figure the thing you should probably do is put it into the user agent, if you're using a browser, so people can decide this stuff themselves. You already have Do Not Track — a "low bandwidth please" signal, stuff like that, I think would be cool. We don't have any browsers doing that yet, but maybe these are early days.
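A default-plus-override policy like the one just described for Branch Magazine might be sketched as below — the intensity bands are invented cut-offs, not Branch's actual thresholds:

```python
# Hypothetical sketch of grid-responsive UI modes with a user override.
# The gCO2/kWh bands are arbitrary illustrative cut-offs.

def choose_mode(grid_g_per_kwh, user_choice=None):
    """Return 'low', 'medium' or 'high' carbon mode for the UI."""
    if user_choice in ("low", "medium", "high"):
        return user_choice          # an explicit user preference always wins
    if grid_g_per_kwh < 150:
        return "high"               # clean grid: full-fat experience
    if grid_g_per_kwh < 300:
        return "medium"             # middling grid: compress images, say
    return "low"                    # dirty grid: text-first, defer media
```

The point of the design is that the grid only sets the default; `choose_mode(400, "high")` still honours the reader who explicitly asks for the rich version.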
Asim Hussain: Yeah, yeah. Yeah.
Chris Adams: All right, question four. This question is about Jevons Paradox. The question is: given Jevons Paradox, insight number four in the State of Green Software report, how do we change the thought pattern that more is more in technology?
Now, it might be worth briefly explaining what Jevons Paradox is before we dive into this question. Asim, I could have a go at explaining Jevons Paradox if you want to get ready for answering this particularly thorny question. Should I do that?
Asim Hussain: Can you do it? Yeah, you do a better explanation of it.
Chris Adams: Okay, so first of all, I'm gonna point people to the fact that this is the fourth insight in the report to buy some time.
But basically, Jevons Paradox is the name given to the phenomenon where increasing the efficiency with which a resource is used can increase its absolute use, even though each individual use is more efficient. This originally came from over 150 years ago, when William Stanley Jevons noticed that making coal-fired steam engines more efficient meant that more people used coal-fired steam engines in new places, which led to an increase in the absolute use of coal.
And he was so worried about this that he thought we would run out of coal, so he started writing all these papers saying, please could we not do this, this is terrifying, we're gonna stop progress if we make everything too efficient. More than a century and a half later, this kind of applies to things like cloud computing — it's often used as a way to say you can't just talk about efficiency, you need to talk about absolute figures.
So if you make something more efficient, you can just end up with more use. The common example is cloud: hyperscalers talk about cloud being much more efficient, but the flip side is that because it's suddenly more efficient, more and more people have access to it, which increases the absolute usage.
And we have seen absolute increases in technology and compute use, and I think that's one of the things inspiring this question. But you also see it in things like ride sharing. There are examples with Uber and Lyft and other companies: when you make it really easy to hail a ride, you end up with more people driving —
you increase the miles driven in a given city, because it's so much more convenient. It's also subsidized by venture capital, which makes it cheaper than other options. But broadly speaking, making things more efficient is said to have a rebound, which can increase total use.
So that's it.
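The rebound effect just described can be made concrete with a bit of arithmetic — a rebound smaller than the efficiency gain still leaves a net saving, while a big enough rebound wipes it out. The percentages below are invented for illustration:

```python
# Back-of-envelope rebound arithmetic: an efficiency gain cuts energy per
# unit of work, and a rebound in usage claws some of that back. Fractions
# are illustrative, not measured figures.

def net_energy(baseline, efficiency_gain, rebound):
    """Energy use after an efficiency gain partly offset by extra usage.
    efficiency_gain and rebound are fractions (0.2 means 20%)."""
    per_unit = baseline * (1 - efficiency_gain)
    return per_unit * (1 + rebound)

moderate = net_energy(100.0, 0.20, 0.10)  # 88.0: net saving despite rebound
backfire = net_energy(100.0, 0.20, 0.30)  # 104.0: rebound exceeds the saving
```

Whether efficiency work pays off in absolute terms hinges entirely on the size of the rebound relative to the gain — which is why the later discussion keeps asking for actual numbers rather than the paradox in the abstract.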
Asim Hussain: Yeah, I just think part of it is really about resource constraint. We were talking about this earlier on, weren't we — I can't remember what the context was, I think it was AI — but if you're resource constrained, you have to make different decisions, and you don't use that resource as much, just cuz it's not there.
So if there are only 10 of something in the world, you'll make choices that only use 10 of that thing. But if you then make it 10 times more efficient, you'll still use all of the resource you had — and just use more of it. So the argument here is that as you make things more efficient, that natural resource constraint, which was forcing you to make trade-offs and at the very least not be wasteful, disappears.
Then you can just start being wasteful. So I think the solution is that there has to be a constraint somewhere. I don't think we're gonna stop Jevons Paradox — I don't think there's any way we can really force the world not to make things more efficient, because that's what we're absolutely engineered to do.
But what we need to do is enact constraints — whether they're artificial, whether they're regulatory, whether they're some other kind — we need to enforce that constraint, and that's how we do it. Like, for instance, when organizations set carbon targets and other kinds of targets to achieve.
Right now the attitude is: it's okay, we'll just carry on increasing, it's okay for now. But there needs to be a real external constraint which forces you into those actions. I think that's the only way we're really going to deal with this, cause I see it right now even with AI — as AI gets more and more efficient, we're just gonna use it more and more to solve problems.
Inefficiently, but conveniently. Yeah, that's my answer.
Chris Adams: Okay.
Asim Hussain: Do I win?
Chris Adams: I think so. I'm gonna wait for the jury to be out on that, and we'll put it to the listeners — that's what we should do. So basically, the main thing I'm getting from what you said is that you need to be prepared to talk about absolute figures here, and that's one of the key things, so
Asim Hussain: Well, I wouldn't necessarily agree with absolutes. I would just say there need to be other forcing functions — not even just one, multiple forcing functions — to force your usage down. I don't know what they are.
Chris Adams: Oh, okay. I guess this is a little bit like when people talk about carbon budgets on websites or on services — that's a decision people have made to go for that. I have a bit of a struggle with this term, because when people talk about Jevons Paradox, it's often used as a way to say it doesn't matter that you're talking about efficiency, because you're just gonna make the usage back.
And there is a subtext which basically says: why are you even trying? It does feel a bit "okay doomer". And if we look at the last couple of decades — if we follow the IEA, the International Energy Agency, the energy people who look at how much power is being used by stuff —
they basically say that over the last 10 years or so we've seen a massive increase in the use of computing, but if you look at the energy usage, we have not seen a corresponding increase: we use more computing, but the energy's been more or less level. Now, you can take issue with those numbers, because when you look at numbers that include, say, China, they look quite a bit higher than the IEA's — but that work isn't peer reviewed, so we can't really use those numbers yet. There is some contention there. But I feel this also ignores the fact that people have been moving faster than Jevons Paradox, to keep things better than they otherwise would have been.
And when this argument is rolled out, a lot of the time it's not rolled out with numbers attached — is there a 10% rebound? Because if you're making 20% savings with a 10% rebound, there's still a net saving. Without those kinds of numbers it ends up being quite an academic and difficult thing to engage with. That's one of the struggles I have when we talk about this, because it's often used either to disincentivize people trying to make honest and effective efficiency changes, or almost as a bit of a gotcha — "it's still growing" — and I feel like, oh, congratulations, I'm really glad you told me we should be thinking in absolute terms about the climate. Okay, yeah, surely we established this years ago.
But this doesn't actually answer the question of how we change the thought pattern that more is more in tech. My assumption would be that you do need to be prepared to think about absolute figures and how you stay inside them — you consider them a constraint, in the same way that you might design something that has to be accessible. You say it has to stay inside these targets, which need to be improving each year. That's what the ITU — the International Telecommunication Union — and the Science Based Targets initiative are basically saying: the absolute carbon emissions of the ICT sector have to be halved by 2030.
And I think one of the things you might need to do is have a narrative that says, we're gonna halve our emissions by this much, and then work out how that fits into how we work. Outside of technology there are ways to talk about some of this, cuz this is edging into the whole discussion about growth: do we target growth so we can have nice things, or can we just aim for the nice things directly?
Yeah, because there's typical economic thinking, which says the thing we need to do is get really rich — and we might poison ourselves along the way and endure all this damage, but because we're so rich, we can then undo all that damage, and somehow unmake extinct all the things we made extinct along the way.
That's one of the arguments around growth: you have enough wealth that it should pay for things. But schools of thought like donut economics basically say there's a social foundation of things everyone should have access to, and there's a certain kind of overshoot, and we can target making sure everyone has enough of a social foundation
while staying inside this zone — if we just target that stuff first. So rather than focusing on "get rich first", we can focus on making sure everyone has access to shelter, to healthcare, stuff like that. User needs is how you might think about it:
what user needs can you meet here, targeting those first, rather than trying to grow for something large. But,
Asim Hussain: Yeah, I agree with you, but I think it falls into that same bucket of arguments: here's a bunch of ways the world could be better, and if we did it this way, wouldn't it be amazing — all these issues would be solved. And I'm always nodding my head and going, that sounds beautiful, Chris.
I would love that world to exist. I just have absolutely no idea how it's even remotely possible to get there, given the way the current engine works. I'll tell you the only way I think this will ever work — and I don't think it's something that happens by 2030.
I think this is something maybe our children might achieve — we'll be dead by the time this can change. We're talking about changing the Overton window; we're talking about a cultural change that will take generations. That's what I think it would really take.
I don't think it's gonna happen in our lifetimes. The world you've described is a beautiful world — my dream is that it will exist — and I think it'll only exist if the culture changes, worldwide and dramatically, and it'll only change if every generation comes along and shifts it a little bit. I'm of the generation where it was all about the money — everybody just wanted money.
It was like, what job gets you the most money? Okay, let's take that job. And I'm hopefully gonna raise my children to be a little bit different, to think differently: what is the most positive impact you can have in this role? How can you be a better custodian of the planet? Driving these kinds of changes, I think, is gonna take generations.
I really do. And I think it's something we actually do have to do.
Chris Adams: Or hand power to people who are not optimizing for the things that you and me might have been optimizing for, perhaps. Yeah. Okay. Christ, I've gone into politics again.
Asim Hussain: Yeah. We've done it again!
Chris Adams: All right. This is our last question in the mailbag, and I think we might have to have another episode to answer some of these questions if there's interest —
we don't know if there will be. So: are there any notable examples of organizations or projects that have successfully implemented green software practices, and what can we learn from them? That's the question. Asim, I'll put this one to you, cuz I suspect you've had a few conversations with people doing some of this.
Asim Hussain: Yeah. In terms of things that have been published and have got large enough to have significant impact, there's some of the stuff that happened at Microsoft. Carbon aware Windows is, I think, one of the bigger implementations of carbon aware computing, and you can draw a direct line from carbon aware Windows to carbon aware Xbox.
We still need to get those people onto the podcast if we can. So I think those are really great examples. And it was actually Scott Chamberlain, who's my lead now at Intel — he's the one that was driving a lot of the carbon aware Windows work, for ages, until he got that out.
So Windows now does carbon aware updates. I know it sounds small, but updates are exactly the type of workload that's shiftable in a carbon aware way. So I think that's a really good example. There's very similar carbon aware work from Google from earlier on — that was almost two, three years ago now —
they did similar stuff with their data center workloads. As for the rest of the work happening in this space, there's a lot still happening in the measurement space, and I know there are smaller bits of consultancy work going on with larger companies. I don't know the specifics, and I don't think there's anything we could talk about publicly that would wow people, cause it's just the guts of organizations and the work they do.
But to me, those are two of the big wins in use cases. And it's always interesting to me that both are carbon aware examples, because for the investment you need to implement carbon awareness, the scale you can reach with it in a very short period of time is very impressive.
Chris Adams: Okay. Alright, thank you Asim. So that's carbon aware programming — there are examples you can point to of carbon aware programming creating measurable savings, and they're workloads which are not particularly latency sensitive but conveniently happen in the background, so they don't result in a poor user experience but still deliver some carbon savings, yeah?
Asim Hussain: The one thing I'll add is that what will drive more of the work in energy efficiency and hardware efficiency and all these other spaces, I think, is measurement — making it ubiquitous, making these things very easy to do. I think the work that you've done with CO2.js, really making the ability to measure this stuff very, very easy, is what's gonna drive a lot of the next generation of changes. It probably already has — I just don't know any of the success stories yet.
There's probably loads of websites. Yeah. But,
Chris Adams: Actually, this is a nice segue into something I think is interesting. There's some work by a company called Sentry Computing — a talk at GrafanaCon about basically how tracking all the metrics for compute usage meant they were able to reduce their own usage by a significant percentage.
That was a cool thing in my view. But the reason I'm talking about it is that not only has he shared some of that stuff, he also ended up proposing a new HTTP response header to the IETF as an RFC — basically saying, this is how HTTP should work, we should bundle these numbers into HTTP so that there's a header for carbon emissions, like Scope 2 or something like that.
Now, whether that's the correct number to use is another matter. But I think that's one of the examples of focusing on the efficiency part, at the resource usage and energy level. If we use the Green Software Foundation's way of thinking about this — carbon awareness, hardware efficiency, energy efficiency — those are the ones worth looking at.
There's also this other thing — I dunno if the numbers are big yet, and I would love a second opinion — there's a company called Storj, which is S-T-O-R-J. They're basically making claims about massively reducing the hardware footprint of providing object storage — S3-style object storage — by using loads and loads of unused storage capacity in data centers,
in the same way that Airbnb can use unused capacity in houses and hotels and things like that.
Asim Hussain: How is it unused if they're using it?
Chris Adams: So the idea is that let's say you've got a bun, a loads of service, and they have free space on hard discs, which aren't being used by anyone right now. So they do that and they use this technique called erasure encoding.
So you take it to file, you split it up enough points so that you don't have to replicate the same file like five times if you just replicate enough of the overlapping shards of it. And are they, I'll share the link for that. Cause I think it's interesting. I don't know enough about the, what kind of peer review have you seen for the numbers, but I think it's extremely clever and it's one of the few examples I've seen of people doing something on the hardware efficiency part of green software that I think is cool and it ends up being sub substantially cheaper than using object storage from some of the big providers, for example. So it's, we're talking like around 20% of the price and the figures they say is it's maybe 70% lower carbon footprint for storing a terabyte of data over a year compared to some of the big providers or a data center.
So that's the stuff I think is interesting, but I need to caveat it: I don't have any independent verification of those numbers yet, even though I think it's super duper cool. All right.
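The erasure coding idea Chris sketches can be made concrete with a little arithmetic. The shard counts below are illustrative, not Storj's actual parameters; the point is just that k-of-n coding buys the same fault tolerance as replication for far less raw storage:

```python
# Storage overhead of plain replication versus k-of-n erasure coding.
# Shard counts are illustrative, not any provider's real settings.

def replication_overhead(copies: int) -> float:
    """Raw bytes stored per byte of user data with full replication."""
    return float(copies)

def erasure_overhead(data_shards: int, parity_shards: int) -> float:
    """Raw bytes stored per byte of user data with k-of-n erasure coding.

    A file is split into data_shards pieces plus parity_shards redundant
    pieces; any data_shards of the total are enough to rebuild it, so up
    to parity_shards nodes can be lost safely.
    """
    return (data_shards + parity_shards) / data_shards

if __name__ == "__main__":
    # Both schemes below survive the loss of 4 nodes.
    print(f"5x replication:   {replication_overhead(5):.2f}x raw storage")
    print(f"10-of-14 erasure: {erasure_overhead(10, 4):.2f}x raw storage")
```

Surviving four node failures costs 5x the storage with replication, but only 1.4x with a hypothetical 10-of-14 code, which is why this approach can shrink the hardware footprint so dramatically.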
Asim Hussain: Chris Adams, always a scientist, looking for peer review of his statements; Asim Hussain always shoots from the hip with whatever stat comes to his mind. There you go.
Chris Adams: Alright, Asim, I think this has taken us to the end of the time we have allotted for our mailbag episode. I really enjoyed this, so thank you very much for coming on. We should probably wrap it up and say thank you to everyone for listening, and we'll have more of the regular programming, with interviews with more experts, coming up in the coming weeks.
All right.
Asim Hussain: Alright,
Chris Adams: Thanks, mate. Take care of yourself, have a lovely week, and everyone, enjoy. For those of you in America who do celebrate it, happy freedom-from-us-Brits day tomorrow.
Asim Hussain: Commiserations to the Brits. Go back.
Chris Adams: Alright, we'll leave it like that. Ta-ra.
Hey everyone. Thanks for listening. Just a reminder to follow Environment Variables on Apple Podcasts, Spotify, Google Podcasts, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing. It helps other people discover the show, and of course, we'd love to have more listeners.
To find out more about the Green Software Foundation, please visit https://greensoftware.foundation That's Green Software Foundation in any browser. Thanks again and see you in the next episode.

Questions in the show:
  • What computation is needed for AI, the explosive use, and what impact will that have on the environment? [7:17]
  • Regarding location shifting - is the foundation concerned that when everyone time shifts to the same location or the same greener grids, that can increase the demand of those grid's energy, which could increase fossil fuel burning to meet said new demand? [18:50]
  • Why not just run low-carbon mode all the time, not just when the carbon intensity is high on “dirty electricity”? [34:35]
  • Given the Jevons’ Paradox, how do we change the thought pattern that more is more in tech? [38:15]
  • Are there any notable examples of organizations or projects that have successfully implemented green software practices? What can we learn from them? [49:00]


TRANSCRIPT BELOW:
Asim Hussain: We're talking about a cultural change that will take generations is I think what it would really take. I don't think this is gonna happen in our lifetimes. I think the world that you've described is a beautiful world. I hope to, I hope my dream it will exist, and I think it'll only exist if the culture changes and the culture changes worldwide and dramatically.
Chris Adams: Hello, and welcome to Environment Variables brought to you by the Green Software Foundation. In each episode, we discuss the latest news and events surrounding green software. On our show you can expect candid conversations with top experts in their field who have a passion for how to reduce the greenhouse gas emissions of software. I'm your host, Chris Adams.
Welcome to a special mailbag episode of Environment Variables. We're thrilled to bring you the most anticipated questions that arose during a recent live virtual event hosted by the Green Software Foundation on World Environment Day, the 5th of June. With over 200 passionate practitioners participating from all around the world, the event featured an expert panel consisting of influential voices in the green software movement.
Our panelists at the time included Asim Hussain, who's here today. Hey, Asim.
Asim Hussain: Heya.
Chris Adams: We also had Anne Currie, the Greentech advocate at Container Solutions and Community Chair at the GSF. We also had Tamara Kneese, UX research leader and strategist and lead researcher of the State of Green Software report, as well as Pindy Bhullar, the ESG Chief Technology Officer at the bank UBS, and a PhD in her own right. During the event, they introduced the Green Software Foundation and unveiled all kinds of insights from the recently published State of Green Software report, which sparked all kinds of engaging discussions. But we didn't really have enough time to cover all the questions that were coming in from the people asking them.
So you can think of today as a bit of a roundup of some of the questions that seemed particularly interesting and felt like they needed a bit more time to delve into properly. That's the plan for today: we're going to look into some of the questions we didn't have time to answer, or, since I wasn't actually there, that I didn't have a chance to answer.
So today it's myself, Chris Adams, the policy chair at the Green Software Foundation, and I'm joined by Asim Hussain, the chair and executive director of the Green Software Foundation, and lover and grower of mushrooms. And Asim, I'll leave some space for you if you want to talk about anything in particular, cuz I have some interesting mushroomy factoids I'll share with you after this.
Asim Hussain: It's just the way you said "lover of mushrooms" was a little bit, um, risque. But yeah, I only love them cuz I like to eat them. I am a mycologist, I grow mushrooms, and I'm currently quite desperately trying to rescue two bags of lion's mane mushrooms, which I think I've let sit in their bags a bit too long.
So hopefully I'll have some lion's mane mushrooms this time next week.
Chris Adams: That sounds cool. Can I share my fact about mycelium, which I think is really cool? So I listen to a podcast called Catalyst with Shayle Kann, and he recently did an episode where he was talking to one of these researchers in South Africa who published some research about mycorrhizal fungus.
Basically, it's the kind of fungus you can think of as being attached to, or interfacing with, the roots of a tree. And they did some research, for the first time, to get an idea of how much CO2, how much carbon, is actually sequestered by this, because there's a kind of symbiotic relationship between this particular kind of fungi and the trees.
So basically, the trees make sugars and stuff like that, whereas the fungi are really good at leaching out all these other kinds of nutrients, and it's basically a swap: the mushrooms give the trees all these nutrients, and the trees give sugar to the mushrooms in return.
And as a result, there's a bunch of CO2 which is drawn into the trees and then fed to the mushrooms, and the figure they calculated for the amount of extra carbon stored in the soil was something like 13 gigatons, which is about a third of all the carbon dioxide emitted by us burning fossil fuels.
This was so cool. I had no idea that shrooms were basically doing all that extra work under the ground, and it made me think of you, Asim. I thought, yeah, he'd be proud of his little guys when he heard about this.
Asim Hussain: I am proud of my little guys, yeah. I didn't listen to the podcast, and I didn't dig into it that deeply, but I read a similar article recently. And it's "mycorrhizal", by the way.
Chris Adams: Thank you. Yeah. Sorry.
Asim Hussain: Yeah. "Myco" means mushroom, or the fungal component, and "rhiza" means root. It's basically a type of fungal relationship that works symbiotically with roots. And yeah, there's a really good book called Wood Wide Web, which talks about this wonderful relationship between trees and just everything around you, actually. And they've actually shown that these networks are really large, and that they act as not even
Chris Adams: a network.
Asim Hussain: like a network, but also like a controlling network.
So you can have two competing trees, and if one's not surviving well, the mycorrhizal network will then force the negotiation for sugar, for carbon, in a way that benefits one of the trees, because a break in the canopy is actually really harmful for everything. There's lots of really intelligent stuff like that happening, and yeah, it's really, really clever.
Chris Adams: That's kind of cool. I had this idea that they're a bit like a kind of shroomternet, but I didn't realize there was this extra, almost diplomat role they're playing as well, getting various entities to play nice with each other. Wow, that's cool.
Asim Hussain: Really cool. And just for the listeners: there's a really interesting guy out there called Paul Stamets, and if you've not heard of him, look him up online. He's been in this space for a long time now, an amateur mycologist who became famous, and he's been really pushing this world. He's got lots of great TED talks on why mushrooms are a kind of sustainability solution, and books written on it. And if you're actually a fan of Star Trek,
Chris Adams: I knew we were gonna
Asim Hussain: you knew I was even gonna head there. So there's a new Star Trek series called Star Trek En... no, hang on, what's it called? It's just called Enterprise, actually. And the lead engineer is called Stamets, actually based on this Paul Stamets. And the engine they have is called the spore drive, and it can travel anywhere in the universe through a network of spores. And I just thought it was amazing that mushrooms have been integrated even into my favorite sci-fi show, which I just love.
Chris Adams: Oh wow. Is it Enterprise or is it Discovery?
Asim Hussain: Discovery. Sorry, you're right. It's
Chris Adams: that was the
Asim Hussain: Yeah, yeah, yeah. Enterprise was that other one that didn't.
Chris Adams: Oh my word, we've gone full nerd already. At least that's hopefully an entertaining segue before we dive into the mailbag. All right. Okay, should we start with the mailbag then, Asim? See what the first question is.
All right. Okay. I watched the recorded World Environment Day panel that you had, and we'll share a link to that, so there will be some things we won't cover because they've already been covered in that recording. Okay, so the first one touches on LLMs and things like that. The question as I read it here was: what computation is needed for AI, its explosive use, and what impact will that have on the environment?
That seems to be the question. Asim, I'll put this to you first, and then we can take it in turns responding to it, if that sounds good to you.
Asim Hussain: Yeah, sure. So obviously AI has come up quite a lot on this podcast, many times. There have been a couple of papers now, and we'll make sure we quote the actual papers in the show notes. There's the original one, from when ChatGPT came out, which suggested that the energy consumption of a ChatGPT-style Bing chat search was about five times more than a normal search. Am I getting that right? Does that sound familiar, Chris? Five times more than a normal one. Now I'm starting to wonder whether that was an understatement, because there's a further paper which talked about the water consumption of a chat.
And that one was interesting, because it talked about the whole conversation: not one question to ChatGPT, but the series of exchanges you have to go through to get to an answer, and that came to about half a liter of water. And I'm not too sure whether that original paper was talking about one individual question or the whole series of questions you ask to get to your answer.
I'm not too sure. What I do think is that there's a significant amount of compute being used in LLMs, and there's not a lot of transparency on it right now.
Chris Adams: This is actually one of the problems that we do have, right? So one thing I might share is that you could do a kind of bottom-up estimate: look at, say, what an Nvidia H100, their ML-specific card, draws, make an assumption about how many might be in a given data center, and work out that if each card pulls this much, then that times this number gives you some kind of figure.
And you do see particularly big figures: people throwing around numbers in the hundreds of megawatts, or even gigawatts, of power, particularly in America right now. But I've actually struggled to find specific figures on this.
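That kind of bottom-up estimate is simple enough to sketch. In the snippet below, everything apart from the H100's published maximum power draw (around 700 watts for the SXM variant) is an invented assumption: the server count, cards per server, and facility overhead are illustrative, not figures for any real deployment.

```python
# Back-of-envelope GPU fleet power estimate, in the spirit of the
# bottom-up approach described above. All deployment numbers here are
# illustrative assumptions, not figures for any real facility.

H100_TDP_WATTS = 700      # Nvidia's stated max power for an SXM H100
CARDS_PER_SERVER = 8      # a common GPU-server configuration
SERVERS = 2_000           # hypothetical deployment size
PUE = 1.2                 # assumed facility overhead (cooling, losses)

def estimated_power_mw(servers: int, cards_per_server: int,
                       card_watts: float, pue: float) -> float:
    """Rough facility power draw in megawatts for a GPU fleet."""
    it_watts = servers * cards_per_server * card_watts
    return it_watts * pue / 1e6

if __name__ == "__main__":
    mw = estimated_power_mw(SERVERS, CARDS_PER_SERVER, H100_TDP_WATTS, PUE)
    print(f"~{mw:.1f} MW")  # 2,000 servers of 8 cards comes to ~13.4 MW
```

Even this toy calculation shows why grid connection capacity becomes the constraint Chris mentions next: a single hypothetical 2,000-server fleet already draws on the order of tens of megawatts.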
It's also worth bearing in mind that, right now, the thing we can actually point to is some of the existing research from the cloud side. There's a company called Vantage, I believe, who do cloud billing analysis, and they recently shared some information about what proportion of spending is being allocated to GPUs versus CPUs and things like that.
And I'll share the link in the show notes. The general argument is that yes, AI is large: the figures on the front page of their most recent report show that what you might classify as AI, as a share of what people spend on typical cloud compute like Amazon's Elastic Compute Cloud, EC2, is between six and eight percent of all EC2 spending, but it's growing extremely quickly. That's what we're seeing here. And I'll share a link, because I suspect there'll be a new version of this cloud cost report, which gives us some numbers about what people are spending.
But this is not the same as the forward investments people will be making. And one of the problems we see is that you're basically running up against the limits of the grid right now, because data centers are such dense users of electricity: you have scenarios where a data center full of these kinds of cards would use more power than the grid is able to actually deliver to it.
So your constraining factor right now is the capacity of the grid to feed the data center, more than anything else, and that's one of the key problems we see come up again and again at the moment.
Asim Hussain: I hear all the arguments, and again, I think there's a real lack of data right now to get a clear answer, so you have to look at other kinds of economic and proxy arguments. And I think there's one factor no one can ignore, which is that the amount of interest and investment heading into LLMs and AI has outstripped anything that existed before. And that could not be true if this were not at least perceived as a massive growth opportunity for organizations, which I think will have a knock-on effect on emissions. Whether that's happening right now is a good question.
Whether it's right now, or whether everybody's seeing that this is the future and putting their effort in, which means this is going to be a big growth area later as well. That's the aspect of this that's just true and that we can't ignore: there's a lot of interest in this space, so I think that's the thing to think about as well.
Chris Adams: So there's one thing worth bearing in mind, and I don't think it's unrelated to the fact that in many places people are constrained on supply being able to meet this demand: you are seeing a real crop of new, smaller, much more efficient LLMs being created, specifically because there's an interest in not being dependent on a single provider, and in being able to run these models on, say, your own hardware, for example.
So I'll share a link to a really nice post by Simon Willison, who's been talking about some of the most recent open models that are designed to run on a laptop, and that can, in many cases, give you results which are comparable to, if not indistinguishable from, some of the really expensive LLMs and generative models you see right now.
So there is a kind of shift here, and I actually don't know what direction you're going to see, because like you said before, we don't have access to what kind of percentage AI is really making up in terms of future investment. We don't really know. What I have done is share a link in the show notes to
a chart from Data Center Knowledge, which gives an idea of projected growth by hyperscalers over the coming few years. We can see figures of, say, Google at around 3,000 megawatts of assumed capacity right now, with Microsoft at similar figures, around the 2,000 megawatt mark,
and you see projections to double that over the coming few years. And I don't know if these are from before or after the decisions people have been making recently, because you've got to remember that
Asim Hussain: right.
Chris Adams: in less than a year, we've seen the Nvidia H100 released, and not just that: now AMD have come out with their own equivalent, their competitor to it. So you now see many more things available. The question is, where are these going to go, and how are they going to be powered?
The worst scenario would be people setting up data centers and then finding non-grid ways of generating power to run these machines, like using diesel and stuff like that.
I'd really hope that doesn't happen, but I can imagine scenarios like that, or possibly people reopening some of the existing generation that was shut down after the crypto collapse. And this is one thing that Tamara mentioned: she said it's worth looking at the role that LLMs are playing in the public discourse.
They're very much filling the same role that the metaverse, or NFTs, were supposed to fill maybe a year ago, for example. What's worth saying is that yes, we know this is being seen as a driver of demand. Whether there are reliable numbers right now is another matter; we can point to various figures for demand and what people are looking at,
but at best these are all currently trailing indicators. The Vantage report, which shows these figures, is only based on what's already been set up, and that doesn't really tell us whether people are stuck with buildings full of graphics cards that they can't plug in and sell AI from right now,
because they can't get this stuff connected to the grid, for example.
Asim Hussain: I also just think we can't ignore the amount of money flowing into this space. I hear the arguments for open source models, and my heart really wants them to win out, but when it costs something like a hundred million dollars to train up a model like GPT-3, which I think was the estimate,
and the real benefits come when you throw even more compute at it, I just don't know if the open source models will win out. Obviously people are spending money because there's an advantage to doing so; people wouldn't normally be able to peel off a billion dollars from a company for no reason.
So I think there's lots of data we don't have. But that's the data we do have, and it's telling us something at least: that there's a lot of
Chris Adams: Yeah, I guess that's one thing we currently do not have; again, it's really hard to get a decent number about where things are going with this. I'm in the camp where there actually is a lot of interesting stuff happening with open models, where people are basically deciding where they're going to compete and trying to come up with alternatives.
When you look at some of the purchases being made by companies which aren't just Microsoft, Facebook, and Google, I think Databricks is a really good example in my view. Databricks published a bunch of open data specifically to help build this competing ecosystem, and they recently purchased a company called MosaicML.
MosaicML published two openly licensed LLMs: one called MPT-30B, and the other called MPT-7B. These are ones you can run on your own laptop right now. They're large, like 19 gigabytes in size, and you need a relatively chunky laptop, but you are seeing this, and the results are comparable, if not close to indistinguishable. I think you do see a bit of an arms race right now, and it's interesting where things will go, because we've linked before to that whole "we have no moat, and neither does OpenAI" memo that leaked from Google. But right now we're not quite sure; I don't know if it's a function of organizations just having loads of access to cheap money, or people seeing that this isn't actually defensible in the long term,
because now you see all these open models coming out, and they might be comparable. It might be the case that if you can just get to the workloads, or get to the workflows, people aren't going to care that much, in the same way that loads of people end up using Microsoft Teams, not necessarily because it offers
Asim Hussain: about it recently.
Chris Adams: offers
the best user experience for someone trying to join a video call, right? It's more that there's a workflow people already have, or a way to bundle some of this in. I wonder if that's where the network effects are actually more important than, essentially, the training stuff, and that's where the levers are. So yes, AI is interesting and cool, but it's actually much more about market structure, antitrust, and stuff like that. That's probably going to be more of the driver, perhaps.
Asim Hussain: Oh, interesting, I see. It's not how powerful your model is, it's how you can integrate it into your existing business model: how you monetize it, or even don't monetize it, how you use it to strategically win against your competitors. And that might not necessarily have anything to do with the raw power of the model.
Chris Adams: We've gone a bit off the initial question, but yeah.
Asim Hussain: That's a risk you take when you ask this podcast an AI question.
Chris Adams: Yes, from people with possibly questionable access to information. Should we move on to the next questions? These might be ones on more solid footing for us. Okay. So the next question was one about time and space: location shifting and time shifting, when people talk about the idea of carbon-aware software.
And the question is basically: regarding location shifting, is the Green Software Foundation concerned that when everyone time shifts to the same location, or to the same greener grids, that can increase the demand on those grids' energy, which could increase fossil fuel burning to meet that new demand?
Now, Asim, you might need to unpack this first, so people understand what happens with the merit order here, because if you're not familiar with how grids work, it's not immediately obvious why lots of people running computing in one part of the world would lead to more fossil fuels burning, other than that grid literally having some fossil fuels in its mix, for example.
Asim Hussain: Let's say you had a frictionless capability to move your compute anywhere in the world at any given moment, and everybody in the world had exactly the same capability of frictionlessly moving their compute, and you all just picked wherever's greenest right now: probably France or somewhere in the Nordics.
Then what would happen is, let's say in the next hour France is the greenest grid in the world: every single bit of compute in the entire world would just move to France. Those data centers, if they could theoretically handle that load, would then suck up all the green energy that made it one of the greenest grids in the first place.
And that grid still needs to make energy for the people in France to boil their kettles and do all the other things they need to do. So all it can do at that moment is burn stuff: typically coal and gas and these things. Those are chemical batteries, the things you can spin up; gas especially is something you can spin up very quickly, and so effectively that just burns more fossil fuels.
That's what would theoretically happen, and this is very theoretical: if everybody did just move all their compute over to, let's say, France, France would be forced at that moment in time to burn more coal and gas.
Have I explained that correctly? Have I missed something out, Chris, or?
Chris Adams: I think that's about right. The thing it might be worth sharing in the show notes is an article I put together called "Understanding Energy Trends at the Layer Below the Internet Stack", which talks about this. There's a kind of really nerdy techno-economic concept called the merit order,
with the idea being that different kinds of energy have different costs. Things like solar and wind, once you've installed them, get their fuel from the sun, so you don't have to purchase any extra sun to run them; essentially the marginal cost is almost free, so they sit very low in the order. Then there are other things which, again, are expensive to install,
but once they're installed, the fuel is relatively cheap for the amount of power you get out, and nuclear is a good example of that. Then you have other kinds of fuel, like coal and gas and so on, where, broadly speaking, there's a trade-off: things that can respond more quickly to demand usually have higher fuel costs and are usually dirtier.
That's the idea behind it, and I think the argument being made here is that if everyone moved all their computing to one part of the world, we would induce all this extra demand, which could only be met by things responding quickly to the extra load on the electricity grid: usually really dirty, gas-fired power stations, or extra coal, or stuff like that.
I think that's the argument, and that's essentially the question. Now that we've explained the premise, I should ask you: what is the official response to this? Is it likely to happen? Is this something we should be aware of, and how do we respond to it?
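The merit-order mechanism Chris describes can be sketched as a toy dispatch model. Every generator name, capacity, and marginal cost below is invented for illustration, but it shows the dynamic the question worries about: extra demand is served by whatever sits next in the cost stack, which is often gas.

```python
# Toy merit-order dispatch: generators are used cheapest-first, so any
# extra demand is served by the next (pricier, dirtier) unit in the stack.
# All capacities and costs below are made-up illustrative numbers.

from typing import NamedTuple

class Generator(NamedTuple):
    name: str
    capacity_mw: float
    marginal_cost: float  # cost per MWh; renewables are near zero

MERIT_ORDER = sorted(
    [
        Generator("wind", 400, 0.0),
        Generator("solar", 200, 0.0),
        Generator("nuclear", 600, 10.0),
        Generator("gas", 800, 60.0),
        Generator("coal", 500, 90.0),
    ],
    key=lambda g: g.marginal_cost,
)

def dispatch(demand_mw: float) -> dict:
    """Fill demand from the cheapest generators upward."""
    remaining = demand_mw
    allocation = {}
    for gen in MERIT_ORDER:
        used = min(gen.capacity_mw, remaining)
        if used > 0:
            allocation[gen.name] = used
        remaining -= used
        if remaining <= 0:
            break
    return allocation

if __name__ == "__main__":
    print(dispatch(1100))  # wind, solar and nuclear cover this: no gas
    print(dispatch(1400))  # nuclear tops out, so gas fills the rest
```

With 1,100 MW of demand, the clean generators cover everything; add 300 MW of hypothetical "migrated compute" and the model has to bring 200 MW of gas online.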
Asim Hussain: My answer to this one is always, have you ever watched the TV series called The Wire? So watch
Chris Adams: Yes.
Asim Hussain: watch The Wire.
Chris Adams: I'm not sure where we're going with this, but go with it. All right? Yeah.
Asim Hussain: Go with this. It's about gangs and police in Baltimore, and in one of the later seasons there's a gangster called, I think, Marlo Stanfield. I remember in one of the episodes, one of his workers is telling him about a problem.
He turns around and goes, "that's one of them good problems." And that's what I think about this. Someone's telling me a problem and I'm like, this is a good problem to have. If we are ever even remotely getting to the point where demand shifting is affecting a grid, that is a level of achievement which is excellent.
Yes, okay, there are negative consequences to that approach, but we are not even remotely there right now, so worrying about it is, I think, a little bit too hyperbolic at the moment. "You shouldn't do something, because if you take it to the absolute extreme it will be negative" is, I think, what I would say this argument boils down to.
I would say demand shifting is never going to be the one solution in your pocket that reduces the emissions of your application or your architecture. I always describe it as one of the things you can do. It's one of the easier things to do, and it gets you started on the much more challenging journey of energy efficiency and hardware efficiency: reducing the amount of energy you use, reducing the amount of compute you use.
But it gets you there, and I think that's why a lot of people have been interested in carbon-aware computing. I always say it's not going to be the solution that solves climate change, it's nowhere near that, but it's a stepping stone on the journey there.
Chris Adams: I think I would have answered that slightly differently, actually. Because when I see this question being asked, it's essentially: will this demand cause people to do this? I think it has an assumption that the people trying to move computing somewhere are doing it because they're looking for greener energy, right? Now, I think there's one way you can solve this purely from an information point of view: if you're looking for the lowest carbon intensity, and you can see that the carbon intensity is increasing, then
Asim Hussain: I see. Yeah. Yeah.
Chris Adams: you would just choose to not run it there. Right. So this is somewhat dependent on organizations having some of this information published and visible for people, but I think that's actually something that can be done.
And some places already make this public information. For example, in the UK this information is visible on a really clear basis: there's a website called carbonintensity.org.uk, which publishes the data under a very permissive license.
France already has its data available for you to pull out, so if you're going to do this, I think you would just look before you deploy something, or build some software to check whether you're going to make things worse. That feels like one of the solutions there, and like something that will be made available to people in a number of different ways.
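A minimal sketch of that "look before you deploy" check, using the UK API Chris mentions, might look like the following. The response shape reflects what carbonintensity.org.uk documents, but the 150 gCO2/kWh threshold is an arbitrary illustrative assumption, not an official recommendation:

```python
# Check grid carbon intensity before kicking off a deployment or batch
# job. Uses the public GB carbon intensity API; the threshold below is
# an arbitrary illustrative choice.

import json
from urllib.request import urlopen

API_URL = "https://api.carbonintensity.org.uk/intensity"

def should_run(intensity_g_per_kwh: float, threshold: float = 150.0) -> bool:
    """Run only when the grid's carbon intensity is below our threshold."""
    return intensity_g_per_kwh < threshold

def current_gb_intensity() -> float:
    """Fetch the current GB grid carbon intensity in gCO2/kWh."""
    with urlopen(API_URL, timeout=10) as resp:
        payload = json.load(resp)
    return payload["data"][0]["intensity"]["actual"]

if __name__ == "__main__":
    intensity = current_gb_intensity()
    print(f"{intensity} gCO2/kWh, run now: {should_run(intensity)}")
```

The same pattern extends to the "will I make it worse?" concern: check the intensity, and the trend, before dispatching, rather than blindly chasing the greenest grid.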
Asim Hussain: You reminded me of a very interesting conversation I had with somebody from Google, cuz they created a service — it started with a V, something — and it predicted what the next hour's workloads were going to look like, and what tomorrow's grid carbon intensity was going to look like, day by day I think. And you didn't just ask where the lowest carbon intensity was and push everything there. You actually thought: I need to run quite a lot, and I can't push it all into that one place —
let me be more intelligent about where I put it. And the idea was thrown out there: what if this service existed? What if we all collaborated? What if we were all so open that we said, do you know what, I'm going to submit to this online database that I need to run this much compute,
I need to run it tomorrow, I want to run it in the greenest region, and it schedules it for everybody. And so you run in France, you run in Germany, you run in Norway, and we work it all out together. That kind of openness with the data, I think, would solve this problem as well — but I don't see any corporation being that open about their future workloads.
Chris Adams: The thing that you just described there, Asim, was essentially how energy markets work, right?
Asim Hussain: Oh,
Chris Adams: And they're regulated markets where there is not one owner — rather than having only the Amazon cloud or the Microsoft cloud, you have multiple players, right? So in order for that to be possible, you would need to have a
kind of different structure, or you'd need people who played a part in making sure things can be dispatched to the correct actors — something like a feed-in tariff for compute, or stuff like that. I feel this is actually quite a nice chance to draw people's attention to a really interesting proposal from Adrian Cockcroft. He's written a piece about why the whole idea of trying to schedule workloads can be counterproductive, which I'm not totally sure I agree with all of, because whether you can actually see people shifting creating that demand is another matter — I'm not sure we have seen that — but I understand the thrust of his argument.
But the real time carbon footprint standard that he's proposing feels like it would go a long way towards providing the numbers that people would need access to, to realize whether they're really doing this. But the other thing you could do is just actually price carbon into the cost of cloud, right?
We already have spot markets. If in your spot markets there was a "spot plus", which basically included the cost of carbon at, say, a hundred euros or a hundred dollars a ton, then that would actually be, in my view, a fairly simple way to make sure you're not shifting all the compute to the worst places.
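The "spot plus carbon" idea is simple arithmetic. Here's a sketch — all the prices, power draws, and grid intensities are made up, and the hundred-dollars-a-ton carbon price is just the figure from the conversation:

```python
# Illustrative "spot plus carbon" pricing: the spot price of an instance
# plus the cost of the CO2 its electricity emits. All figures are made up.

CARBON_PRICE_PER_TONNE = 100.0  # dollars per tonne of CO2, as discussed

def spot_plus_carbon(spot_price, kwh_per_hour, grid_gco2_per_kwh):
    """Hourly price including the cost of the carbon emitted."""
    grams_co2 = kwh_per_hour * grid_gco2_per_kwh
    carbon_cost = (grams_co2 / 1_000_000) * CARBON_PRICE_PER_TONNE
    return spot_price + carbon_cost

# A 1 kW instance: cheap spot price on a dirty grid versus a pricier
# spot price on a clean one.
dirty = spot_plus_carbon(spot_price=0.05, kwh_per_hour=1.0, grid_gco2_per_kwh=700)
clean = spot_plus_carbon(spot_price=0.08, kwh_per_hour=1.0, grid_gco2_per_kwh=50)
print(f"dirty grid: ${dirty:.3f}/h  clean grid: ${clean:.3f}/h")
```

With these illustrative numbers the nominally cheaper instance on the dirty grid ends up costing more once carbon is priced in, which is exactly the steering effect Chris describes.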
That's if you're gonna go totally neoliberal and price based — then yeah, that's how you can do this stuff. But the other thing you can do is literally just go totally wacky and have massive batteries in data centers, the way people look like they're doing for other significant draws of electricity.
So for example, if you look at electric car charging stations, lots of them now have lots and lots of onsite storage, to deal with the fact that car charging is really spiky — most of the time a charger is not being used, but then people come to it and need to pull a huge amount of power very quickly.
Likewise, high-end induction hobs, which have their own batteries inside them to, again, deal with a big spike in use. There's lots of strategies you can use which mean that you don't need to actually burn fossil fuels for this. It does involve nerding out about the grid, and that's one thing which is new to a lot of the people who are working with this.
So to an extent, I don't see it the same way that you might see it, Asim, cause I feel this is actually one that can be addressed using various techniques from sectors which aren't technology, for example.
Asim Hussain: I will just say my view is probably clouded from having many conversations with people about carbon aware computing being shut down because of versions of this question — yes, we like it, but if it was taken to the extreme, then it would destroy our whole infrastructure. I've had this question phrased as a response to me, as a reason for why we can't even entertain looking at carbon aware computing.
So that's where you've probably triggered my default defense mechanism against this question. But I think there's a very nuanced, very important topic here, and I think it's important for you and me to have different opinions, cause that's how we share all this important knowledge with the world.
Yeah.
Chris Adams: All right. Oh, there's one thing I should add a bit of a plug for. There's some really interesting work from, I think it's Abel Souza and Noman Bashir. They've been publishing some fantastic papers talking specifically about the likely impacts of carbon aware computing, and what the savings could possibly be.
And it is quite a technical paper, actually — when I read it I was like, oh wow, there's a lot of numbers in these charts — but it's really good. I found it one of the most useful ones for informing my opinion about where this goes, and I'll make sure that we share some links to it, cause I think it'll add some extra nuance to this conversation.
The other thing to bear in mind is that it's not like large hyperscale companies are not making enough money to buy batteries, right? If they're able to spend 70 billion on share buybacks in a given year, they can probably afford to buy literally hundreds of megawatts of extra capacity.
You could just have five or six hours of local battery storage so that you wouldn't even need to touch the grid — you'd just run it locally if you wanted to be sure that your power is green. But that's a separate discussion, so I'm gonna park that once again, cuz I'm worried about sounding like a bit of a broken record on this one.
Asim Hussain: You do. I don't know where all this money is, cause it doesn't find its way into my budgets. But yeah.
Chris Adams: There was a report from the Rocky Mountain Institute, who were doing some analysis on green Bitcoin and things, and they said that with 115 billion US dollars, you could buy up every single coal fired power plant on Earth and replace it with renewables.
Asim Hussain: 115 billion?
Chris Adams: 115, yeah. Last year, the combined share buybacks — the money made by Apple, Google, Amazon, and Microsoft, as in they had so much money that they just thought, oh, I'm just gonna buy my own shares —
that was more than $125 billion. So, single-handedly, that could solve it in a single year. But we've decided to spend it on — do you know the problem with climate change? Shareholders aren't getting enough money back. I feel that this is the thing we need to be talking about when we're talking about green software: where is this money going?
Cuz we clearly have the money for this. It's just a case of priorities, and we could be moving faster if we really wanted to. But I'll stop now, because I'm getting a little bit ranty. Sorry about that.
Asim Hussain: I want everybody to know that the entire time I worked for Microsoft, Chris Adams, whenever he was talking to me on chat, would always type M-dollar-sign — always M$ for Microsoft. But I dunno what you'd do now. I'm at Intel now, I suppose — could you do a euro sign for Intel?
Maybe there's a reason why you don't put a euro sign in there for Intel. But anyway.
Chris Adams: I think it's cuz of something from when I first came into technology and used to read The Register. They had nicknames for everyone — I think IBM was called Big Blue, and there was another one, "the Beast" from somewhere — and, yeah, Microsoft was M$, like,
Asim Hussain: What was it? So
Chris Adams: because they made so much cash.
Yeah. It wasn't me being smart — it was totally second-hand, mate. Yeah. They are very effective at basically
Asim Hussain: very, they, yeah. They've
Chris Adams: shed-loads of money. Absolutely.
Asim Hussain: yeah. That's what com, that's what corporations are there for.
Chris Adams: Do you know what? In India, right — I didn't realize this, but in India, for every large company, as a condition of operating inside India, which is the most populous nation on Earth, something like 2% of all profits have to be, by law, invested in what India considers priority areas — so specifically into things like renewables, right?
So there are all these mechanisms people actually use that mean we can direct funding to places to speed up action on climate. And if we're talking about technology, and talking about carbon awareness and stuff like this, then we really need to be prepared to think about, and have conversations about, how much in the way of resources we really wanna allocate towards what the science is spelling out, and how much we need to make sure that share price goes up. Because, okay, it's nice that people have pensions and things and all that, but it would also be nice to have a livable world — and having just this much money available, come on, let's get this sorted.
Literally one year would solve it, but that's another discussion. Anyway, let's move to the next question, because I think people are here to learn about code, not about economics.
Asim Hussain: We started off with opinions on AI, we turned into an energy podcast, and now we're talking about capitalism. So let's just turn into a politics podcast. Let's just do it.
Chris Adams: I guess it's everywhere. Let's move to, okay, next question.
Asim Hussain: next question.
Chris Adams: Okay. This question was about the idea of some machines and some software running in a low carbon mode, and it basically asked: why not just run low carbon mode all the time, not just when the carbon intensity is high on dirty electricity?
I think this might be a reference to things like Branch Magazine, or even your CarbonHack — one of the winning designs was a software kit that would show different versions of a site depending on how dirty the electricity was, to stay inside a carbon budget.
Asim over to you.
Asim Hussain: It's an interesting question. The way I've always imagined it — and in fact the way you can use Branch Magazine; you can just go on Branch Magazine and select low carbon mode all the time, sorry if I've got the terminology wrong — is that these kinds of UI modes in your system are something you can select if you want to, or something we should auto-select based on how carbon intense the grid is. So I've always imagined it as user driven. And I suppose if you've got a product and you force yours into low carbon mode and a competitor doesn't, and all your users move over to the competitor, then you've gotta factor that in as well.
But I also think, Chris — is there an argument here about whether that stops money from going into renewables? I don't think it does. I think the money will still go to renewables.
Chris Adams: No, I don't think this is really about finance.
Asim Hussain: Yeah. Yeah. I don't think it was finance. Yeah. Yeah. Okay. Yeah.
Chris Adams: Okay, so the way that we designed this in Branch Magazine, cuz we were playing around with this idea: we basically made it user definable, so readers could choose to override it, but we would set a default. It was as much an education piece as anything else, because a lot of the time people aren't even aware of where the power comes from — most of the time you don't even think about it. So the idea that this is being foregrounded, and the materiality is being exposed to you, was the new idea for us. That was why we were doing it, to really emphasize this, because we thought it was a nice way to hark back to some of the ideals of the web being something that's supposed to be open and accessible for everyone. So we figured if you design a low carbon mode, it emphasizes the fact that the grid changes, but at the same time emphasizes the fact that when you're using something, it should be accessible for people who may be partially sighted or stuff like that.
Then you can embed some of these other values in how you build things, to communicate a different kind of sensibility. So I think the general answer is that a lot of the time people do quite enjoy having rich experiences, and having a kind of sober or monkish experience all the time might not be particularly compelling for lots of people.
And I think it's okay to be explicit about some of that, right? I don't think it's realistic to think everyone only ever wants to see some of this stuff, or to make all those decisions for someone else — I think that might be a little bit too paternalistic. But that was my take on it.
There are things you could do, though. There are ways to provide a rich experience whilst reducing the resource impact, in the same way that you can refactor code whilst reducing the amount of computation it needs to do something.
And I figure the thing you should probably do is just put it into the user agent, if you're using a browser, so that people can decide this stuff themselves. You already have Do Not Track; a "low bandwidth, please" signal, stuff like that, I think would be cool. We don't have any browsers doing that yet, but these may be early days.
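The default-plus-override logic described for Branch Magazine could be sketched like this. The thresholds and mode names below are illustrative, not Branch's actual values:

```python
# Sketch of a Branch Magazine style "carbon mode" chooser: pick a UI
# mode from grid intensity, but let the user's explicit choice win.
# Thresholds (in gCO2/kWh) and mode names are illustrative only.

def choose_mode(grid_gco2_per_kwh, user_override=None):
    if user_override is not None:
        return user_override  # the user's choice always wins
    if grid_gco2_per_kwh < 100:
        return "high"    # clean grid: full imagery and features
    if grid_gco2_per_kwh < 300:
        return "medium"  # dithered images, fewer requests
    return "low"         # mostly text while the grid is dirty

print(choose_mode(450))                        # low: grid is dirty
print(choose_mode(450, user_override="high"))  # high: user overrode it
```

The override parameter is the key design point from the conversation: the mode is a default that educates, not a restriction imposed on the reader.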
Asim Hussain: Yeah, yeah. Yeah.
Chris Adams: All right, question four. This question is about Jevons paradox. It says: given Jevons paradox — insight number four in the State of Green Software report — how do we change the thought pattern that more is more in technology?
Now, it might be worth briefly explaining what Jevons paradox is before we dive into this question. Asim, I could have a go at explaining Jevons paradox, if you wanna get ready for answering this particularly thorny question. Should I do that?
Asim Hussain: You do it. Yeah, you do a better explanation of it.
Chris Adams: Okay, so first of all, I'm gonna point people to the fact that this is the fourth insight in the report to buy some time.
But basically, Jevons paradox is the name given to the phenomenon where, when you increase the efficiency with which a resource is used, you can increase its absolute use, even though each individual use is more efficient. This originally comes from the nineteenth century, when William Stanley Jevons noticed that making coal fired steam engines more efficient meant that more people used coal fired steam engines in new places, which would lead to an increase in the absolute use of coal.
And he was so worried about this that he thought we would run out of coal, so he started writing all these papers saying, please could we not do this, this is terrifying, we're gonna stop progress if we make everything too efficient. And more than a hundred years later, this kind of applies to things like cloud computing — it's often used as a way to say you can't just talk about efficiency, you need to talk about absolute figures.
So if you make something more efficient, you just end up with more use. And the common example is cloud: hyperscalers talk about cloud being much more efficient, but the flip side is that because it's suddenly more efficient, more and more people have access to it, which increases the absolute usage.
And we have seen absolute increases in technology and compute use, and I think that's one of the things inspiring this question. But you also see it in things like ride sharing. There are examples with Uber and Lyft and other such companies: when you make it really easy to hail a ride, more people end up driving.
You increase the miles driven in a given city, because it's so much more convenient. It's also subsidized by venture capital, which makes it cheaper than other options. But broadly speaking, making things more efficient is said to have a kind of rebound, which can increase the total use.
So that's it.
Asim Hussain: Yeah. I just think part of it is really about resource constraints. We were talking about this earlier on, weren't we — I can't remember what the context was, maybe AI — that if you're resource constrained, you have to make different decisions, and you don't use that resource as much, just cause it's not there.
So if you're resource constrained — if there's only 10 of something in the world — you'll just make choices that only use those 10. But then if you make it 10 times more efficient, you'll still use all of the resource you had, and just use more of it. So I think the argument here is that as you make things more efficient, that natural resource constraint, which was forcing you to make trade-offs and be, at the very least, not wasteful, disappears.
Then you can just start being wasteful. So I think the solution is that there has to be a constraint of some kind. I don't think we're gonna stop Jevons paradox; I don't think there's any way we can really force the world not to make things more efficient, just because that's what we're absolutely engineered to do.
But what we need to do is enact constraints — whether they're artificial, whether they're regulatory, whether they're some other aspect of it. We need to enforce that constraint, and that's how we do it. Like, for instance, when organizations set carbon targets and other kinds of targets to achieve.
Right now the attitude is: it's okay, we'll just carry on increasing, it's fine for now. But there needs to be a real external constraint which forces you into those actions. I think that's the only way we're really going to deal with this, cause I see it right now even with AI: as AI gets more and more efficient, we're just gonna use it more and more to solve problems —
inefficiently, but conveniently. Yeah. That's my answer.
Chris Adams: Okay.
Asim Hussain: Do I win?
Chris Adams: I think so. I'm gonna wait for the jury to come back — we'll put that to the listeners. That's what we should do. So basically, I think the main thing I'm getting from what you said there is that you do need to be prepared to talk about absolute figures here, and that's one of the key things, so —
Asim Hussain: Well, I wouldn't necessarily agree with absolutes. I would just say there needs to be another forcing function — or not even just one; multiple other forcing functions — to force your usage down, whatever that is. I don't know what it is.
Chris Adams: Oh, okay. I guess this is a little bit like when people talk about carbon budgets on websites or carbon budgets on services — that's a decision people have made to go for that. I have a bit of a struggle with this term, because when people talk about Jevons paradox, it's often used as a way to say it doesn't matter that you're talking about efficiency, because you're just gonna make the usage back.
And there's a kind of subtext which basically says, why are you even trying? It does feel a bit "okay doomer". And if we look at the last couple of decades — if we follow the IEA, the International Energy Agency, the energy people who look at how much power is being used by stuff —
they basically say that over the last 10 years or so, we've seen a massive increase in the use of computing, but if you look at the energy usage, we have not seen a corresponding increase: we use more computing, but the energy's been more or less level. Now, you can take issue with those numbers, because when you look at numbers that include, say, China, they look quite a bit higher than the IEA's, but that's not peer reviewed and we can't really use those numbers yet — so there is some contention there. But I feel this also just ignores the fact that people have been moving faster than Jevons paradox, to keep things better than they otherwise would've been.
And I feel like when this argument is rolled out, a lot of the time it's not rolled out with numbers. Is there a 10% rebound? Because if you're making 20% of savings with a 10% rebound, there's still a net saving. Without these kinds of numbers it ends up being quite an academic and difficult thing to engage with, and that's one of the struggles I have when we talk about some of this, because it's often used either to disincentivize people trying to make honest and effective changes in the efficiency of stuff, or as a bit of a gotcha to say, it's still doing this. And I feel like, oh, congratulations, I'm really glad you told me that we should be thinking in absolute terms about the climate, right?
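The rebound arithmetic Chris is asking for can be made concrete in a couple of lines. The figures are illustrative, matching the 20%-savings-versus-10%-rebound example from the conversation:

```python
# Rebound arithmetic: how much of an efficiency saving survives a
# Jevons-style rebound in demand. All figures are illustrative.

def net_use(baseline, efficiency_saving, rebound):
    """Resource use after an efficiency gain, where `rebound` is the
    fraction of the saving eaten by increased demand."""
    saved = baseline * efficiency_saving
    return baseline - saved + saved * rebound

# 20% saving with a 10% rebound still nets out as a real saving...
print(net_use(100.0, 0.20, 0.10))
# ...whereas a rebound over 100% ("backfire") wipes the gain out entirely.
print(net_use(100.0, 0.20, 1.50))
```

This is the distinction the gotcha version of the argument skips: a rebound below 100% erodes a saving without erasing it, so efficiency work only becomes pointless in the backfire case.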
I'm like, okay, yeah, surely we established this years ago. But this doesn't actually answer the question of how we change the thought pattern that more is more in tech. My assumption would be that you do need to be prepared to think about absolute figures and how you stay inside them — you consider them a constraint, in the same way that you might require a design to be accessible. You say it has to stay inside these targets, which need to improve each year. That's what the ITU, which is the International Telecommunication Union, and the Science Based Targets initiative — all these folks — are basically saying: the absolute carbon emissions of the ICT sector have to be halved by 2030.
And I think that's one of the things you might need to do: have some narrative that says, yeah, we're gonna halve our emissions by this much, and then work out how we fit that into how we work. Outside of technology there are ways to talk about some of this, cuz this is edging a little into the whole discussion about growth: is it growth first — do we target growth so we can have nice things — or can we just aim for the nice things directly?
Yeah, because there's typical economic thinking, which says the thing we need to do is get really rich, and we might poison ourselves along the way and endure all this damage, but because we're so rich, we can then undo all that damage — and somehow un-make extinct all these things which we made extinct along the way.
That's one of the arguments around growth: you have enough wealth that it should pay for things. But schools of thought like doughnut economics basically say there's a social foundation of things everyone should have access to, and there's a certain kind of overshoot, and we can target making sure that everyone has enough of a social foundation
while staying inside this safe zone, if we just target that stuff first — if we think about the things we want people to have immediately. So rather than just focusing on getting rich first, we can focus on making sure everyone has access to shelter, access to healthcare, stuff like that. User needs is one way you might think about it:
what user needs can you meet here that you're targeting first, rather than trying to grow into something large? But —
Asim Hussain: Yeah, I agree with you, but I just think it falls into that same bucket of arguing which goes: here's a bunch of ways the world could be better, and if we did it this way, wouldn't it be amazing, all these issues would be solved. And I'm always nodding my head and going, that sounds beautiful, Chris,
I would love that world to exist — I just have absolutely no idea how it's even remotely possible to get there, given the way the current engine works. Actually, I'll tell you the only way I think this will ever work, and I do think it's not something that happens by 2030.
I think this is something maybe our children might be able to do; we'll be dead by the time this can change. We're talking about changing the Overton window — a cultural change that will take generations is, I think, what it would really take.
I don't think this is gonna happen in our lifetimes. The world that you've described is a beautiful world, and my dream is that it will exist. But I think it'll only exist if the culture changes, and the culture changes worldwide and dramatically, and it'll only change if every generation comes along and shifts it a little bit. I'm of the generation where it was all about the money; everybody just wanted money —
what job gets you the most money? Okay, let's take that job. I hope I'm gonna raise my children to be a little bit different, to think differently: what is the most positive impact you can have in this role? How can you be a better custodian of the planet? These kinds of changes, I think, are gonna take generations.
I really do. And I think it's something we actually do have to do.
Chris Adams: Or hand power to people who are not optimizing for the things that you and me might have been optimizing for, perhaps. Yeah. Okay. Christ, I've gone into politics again.
Asim Hussain: Yeah. We've done it again!
Chris Adams: All right. This is our last question in the mailbag, and I think we might have to have another episode to answer some of these questions, if there's interest —
we don't know if there will be. So: are there any notable examples of organizations or projects that have successfully implemented green software practices, and what can we learn from them? That's the question. Asim, I'll put this one to you, cuz I suspect you've had a few conversations with people doing some of this.
Asim Hussain: Yeah. I think that in terms of things that have been published and have got large enough to have significant impact, there's some of the stuff that happened at Microsoft. Carbon aware Windows is, I think, one of the bigger implementations of carbon aware computing, and there's a direct line between carbon aware Windows and carbon aware Xbox.
We still need to get those people onto the podcast if we can. So I think those are really great examples. And it was actually Scott Chamberlain, who's my lead now at Intel — he was driving a lot of the carbon aware Windows work, for ages, until he got it out.
So Windows now does carbon aware updates. I know it sounds small, but updates are exactly the type of workload that's carbon aware and shiftable. I think that's a really good example. There was very similar carbon aware work from Google earlier on — that was almost two, three years ago now.
They did similar stuff with their data center workloads. As for the rest of the work happening in this space — there's a lot still happening right now in the measurement space, and I know there are smaller bits of consultancy work going on with larger companies. I don't know the specifics, and I don't think there's anything we could talk about publicly that would wow people, cause it's just the guts of organizations and the work they do.
But to me, those are two of the big wins in terms of use cases. And it always interests me that they're both carbon aware examples, because the investment you need to implement carbon awareness, relative to the scale you can reach with it in a very short period of time, is very impressive.
Chris Adams: Okay. Alright, thank you, Asim. So that's carbon aware programming: there are examples you can point to of carbon aware programming producing measurable figures, in workloads which are not particularly latency sensitive but are convenient to run in the background — ones which don't result in a poor user experience but still deliver some carbon savings, yeah?
Asim Hussain: The one thing I'll add to that is that what will drive more of the work in energy efficiency and hardware efficiency and all these other spaces, I think, is measurement — making it ubiquitous, making these things very easy to do. I think the work that you've done with CO2.js, really making the ability to measure this stuff very, very easy, is what's gonna drive a lot of the next generation of changes. It probably already has; I just don't know any of the success stories there yet.
There's probably loads of websites. Yeah. But —
Chris Adams: Actually, this is a nice segue to something I think is interesting. There's some work by a company called Sentry Computing — a talk at GrafanaCon about basically how tracking all the metrics for compute usage meant they were able to reduce their own usage by X percent.
That was a cool thing in my view. But the reason I'm talking about this is that not only has he shared some of that stuff, he also ended up proposing a new HTTP response header, specifically to the IETF as an RFC, to basically say: this is how HTTP should work. We should bundle these numbers into HTTP so that there is a header for carbon emissions, like Scope 2 or something like that.
Now, whether that's the correct number to use is another matter. But I think that's one of the examples of focusing on the efficiency part, at the resource-usage and energy level. If we use the Green Software Foundation way of thinking about this — carbon awareness, hardware efficiency, energy efficiency — those are the things I think are worth looking at.
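For illustration only, here is a sketch of what consuming such a header might look like. The header name and value format below are hypothetical inventions for this example, not the wording of the actual IETF proposal mentioned above:

```python
# Sketch of parsing a hypothetical carbon-emissions response header.
# "Carbon-Emissions" and its "scope2=...;unit=..." format are invented
# for illustration; the real proposal may look nothing like this.

def parse_carbon_header(headers):
    """Parse a hypothetical 'Carbon-Emissions' header, e.g.
    'scope2=0.023;unit=gCO2e' -> {'scope2': 0.023, 'unit': 'gCO2e'}."""
    raw = headers.get("Carbon-Emissions")
    if raw is None:
        return None  # server doesn't report emissions
    fields = dict(part.split("=", 1) for part in raw.split(";"))
    return {"scope2": float(fields["scope2"]), "unit": fields["unit"]}

response_headers = {
    "Content-Type": "text/html",
    "Carbon-Emissions": "scope2=0.023;unit=gCO2e",  # hypothetical header
}
print(parse_carbon_header(response_headers))
```

The appeal of the idea is visible even in this toy form: once the number rides along with every response, any client or monitoring tool can aggregate emissions without a separate measurement pipeline.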
There's also this other thing — I dunno if the numbers are big yet, and I would love a second opinion — but there's a company called Storj, which is S-T-O-R-J. They are basically making claims about massively reducing the hardware footprint of providing object storage — S3 style object storage — by using loads and loads of unused capacity in data center storage,
in the same way that Airbnb can use unused capacity in houses and hotels and things like that. And they're basically doing this.
Asim Hussain: Unused, if they're using it?
Chris Adams: So the idea is that, let's say you've got loads of servers, and they have free space on hard disks which isn't being used by anyone right now. So they use that, and they use this technique called erasure encoding.
So you take a file and you split it up into enough pieces that you don't have to replicate the same file five times; you just replicate enough of the overlapping shards of it. I'll share the link for that, because I think it's interesting. I don't know what kind of peer review the numbers have had, but I think it's extremely clever, and it's one of the few examples I've seen of people doing something on the hardware efficiency part of green software that I think is cool. It also ends up being substantially cheaper than using object storage from some of the big providers, for example. We're talking around 20% of the price, and the figure they cite is maybe a 70% lower carbon footprint for storing a terabyte of data over a year compared to some of the big providers or a data center.
So that's the stuff that I think is interesting, but I need to caveat that: I don't have any independent verification of that stuff yet, even though I think it's super duper cool. All right.
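The hardware-efficiency argument behind erasure coding comes down to simple arithmetic: with a k-of-n code, a file is split into n shards of which any k are enough to rebuild it, so the storage expansion factor is n/k, versus 3x or 5x for plain replication. The 29-of-80 parameters below are assumptions chosen for the example, not necessarily Storj's actual settings:

```javascript
// Compare the storage overhead of plain replication with k-of-n erasure
// coding. With erasure coding, stored bytes = original size * (n / k),
// because any k of the n shards can reconstruct the file.

function replicationOverhead(copies) {
  return copies; // each full copy stores the whole file again
}

function erasureOverhead(k, n) {
  return n / k; // expansion factor of a k-of-n erasure code
}

// Illustrative parameters (assumed for the example):
console.log(`5x replication stores ${replicationOverhead(5)}x the data`);
console.log(
  `29-of-80 erasure coding stores ${erasureOverhead(29, 80).toFixed(2)}x the data`
);
```

So even a conservative erasure-coding scheme can tolerate many lost shards while storing well under half the bytes that 5x replication would, which is where the claimed hardware savings come from.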
Asim Hussain: Chris Adams, always a scientist, looking for peer review of his statements; Asim Hussain always shoots from the hip with whatever stat comes to his mind. There you go.
Chris Adams: Alright, Asim, I think this has taken us to the end of the time we have allotted for our mailbag episode. I really enjoyed this, so thank you very much for coming on. We should probably wrap it up and say thank you to everyone for listening; we'll have more of our regular programming, with interviews with more experts, coming up in the coming weeks.
All right.
Asim Hussain: Alright,
Chris Adams: Thanks mate. Take care of yourself, have a lovely week, and everyone enjoy. For those of you who do celebrate it in America, happy freedom from us Brits tomorrow.
Asim Hussain: Commiserations to the Brits. Go back.
Chris Adams: Alright, leave it like that. Torah.
Hey everyone. Thanks for listening. Just a reminder to follow Environment Variables on Apple Podcasts, Spotify, Google Podcasts, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing. It helps other people discover the show, and of course, we'd love to have more listeners.
To find out more about the Green Software Foundation, please visit https://greensoftware.foundation That's Green Software Foundation in any browser. Thanks again and see you in the next episode.
