Thank you, Jeff. That's a really interesting article.  



On Fri, Jan 17, 2025 at 14:34 Jeff Kaufman via Contra Callers <contracallers@lists.sharedweight.net> wrote:
Hi Louise,

I started writing up a long response, then decided it was a better fit for a blog post, then realized someone else had already written the blog post I wanted to write: https://andymasley.substack.com/p/individual-ai-use-is-not-bad-for

The overall point is that querying an LLM uses very small amounts of both energy and water, much less than many everyday activities.

I do think caution around AI is justified, and that preventing AI-driven catastrophes is one of the most important problems to work on.  But this isn't about energy usage or environmental impact, and I don't think we should be discouraging others from using AI tools to help with formatting or similar, as Rick was doing in his first message.

Jeff

On Mon, Jan 13, 2025 at 6:41 PM Louise Siddons via Contra Callers <contracallers@lists.sharedweight.net> wrote:
Jeff,

While I think derailing completely into a bibliography of the environmental impact of AI is inappropriate for this list (and despite your generous acknowledgement that you may have missed something, I’m sure you’re as capable as I am of reviewing the literature), your skepticism is on-topic enough in context for me to say some things and then be done:

First, the historic issues around water provision and grid-derived power supply at data centers in the midwestern US offer some context for more recent discussions about (all contemporary) tech and energy. Second, the pervasive discussion of nuclear energy as a useful “new” energy source gives some indication of the amount of power that emerging systems need. The articles I’ve read for work that review the environmental impact of future computing have so far relied either on the argument that nuclear is clean energy (and the 1980s might have something to say about that; it should at least be a public discussion rather than a private one), or on the assumption that the technology itself will produce new efficiencies or solutions, sooner than we otherwise would, that will make it all okay. For me this latter argument relies a bit too much on the optimism of people who have directly contributed to many of the problems in the world today, and/or the philosophies they espouse (and in fact my primary conclusion overall has been that it’s shockingly hard to get good data on this question, which in itself should prompt closer examination on all our parts).

There are also social and cultural reasons to be cautious about AI, as it is being developed quickly and without significant ethical oversight — but they really are beyond the scope of this discussion except to say that I think the human environment is also worthy of concern.

To Michael’s point earlier, some people may like to know (minuscule impact or not) that you can use “-ai” in your Google searches to stop it from giving that AI-generated summary at the top of search results. A bit like my Amazon boycott and my personal choice not to have a car, it’s a futile gesture in the grand scheme but one that feels right to/for me, as it’s a “feature” I didn’t ask for and don’t need, and which I see causing harm to/for others.

Louise.


> On 13 Jan 2025, at 22:20, Jeff Kaufman via Contra Callers <contracallers@lists.sharedweight.net> wrote:
> 
> Louise, Keith: when you say that querying LLMs like this is an "environmental disaster" or "bad for the planet", what are you referring to?  Most claims I've seen along these lines don't hold up at all when you start looking into the sourcing, but I might be missing something?
>
> Jeff
_______________________________________________
Contra Callers mailing list -- contracallers@lists.sharedweight.net
To unsubscribe send an email to contracallers-leave@lists.sharedweight.net