AI isn't a shortcut for thinking. In her guide for skeptics, Hilary Gridley reframes AI as a collaborator, not a replacement. Use it like spellcheck for your thoughts. Don't fear it; iterate with it. Insight improves, speed follows. Full post: https://hils.substack.com/p/the-ai-skeptics-guide-to-ai-collaboration
#pdp7oxq
(#pdp7oxq) But it is still a giant inefficient use of resources and energy 🤣
#6mcdmqa
(#pdp7oxq) @prologic@twtxt.net Since you have to check and double-check everything it spits out (without providing sources), I don't find any of this helpful. It's like someone's in the room with you and that person is saying random stuff that might or might not be correct. At best, it might spark some new idea in your head and then you follow that idea the traditional way.
Information published on the internet (or anywhere, for that matter) was never guaranteed to be correct. But at least you had a "frame of reference": "Ah, I read this information about Linux on a blog that usually posts about Windows, so this one single Linux post might not necessarily be correct." That is completely lost with LLMs. It's literally all mushed together. 🤷
#brit4ya
(#pdp7oxq) @movq@www.uninformativ.de Yeah, I couldn't agree more. Using it in any way to form "truths" or to do anything that requires a high degree of "accuracy" is utterly pointless.
#tebwz7q
(#pdp7oxq) There are other tasks LLMs are far better suited for, which come with their own downsides, and gawd, they're so expensive and unrealistic to run yourself 🤦‍♂️ Do you know what one of these NVIDIA H100s costs? 😲 That's right! 🤣 > $50k USD 😱 And many of the models out there require 8 of these suckers 🤣 Each one consumes around ~400W of power (not including the machine that houses them!)
#r5ihb6q
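For a rough sense of scale, here's a back-of-the-envelope sketch using the figures quoted above (~$50k and ~400W per H100, 8 cards per node); the electricity price is an illustrative assumption, not a quote.

```python
# Back-of-the-envelope cost/power estimate for an 8x NVIDIA H100 node,
# using the rough figures from the thread (illustrative assumptions only).
CARD_PRICE_USD = 50_000   # ~price per H100
CARD_POWER_W   = 400      # ~draw per card, excluding the host machine
CARDS_PER_NODE = 8        # many large models want a full 8-GPU node
KWH_PRICE_USD  = 0.30     # assumed electricity price per kWh

capex = CARD_PRICE_USD * CARDS_PER_NODE
power_kw = CARD_POWER_W * CARDS_PER_NODE / 1000
yearly_energy_cost = power_kw * 24 * 365 * KWH_PRICE_USD

print(f"GPUs alone: ${capex:,}")                                    # $400,000
print(f"GPU power draw: {power_kw:.1f} kW")                         # 3.2 kW
print(f"Electricity per year (24/7): ${yearly_energy_cost:,.0f}")   # ~$8,410
```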
(#pdp7oxq) @prologic@twtxt.net Hmm, speaking of locally running "AI" stuff: Someone on Mastodon has this in their profile description:
My profile pic is AI modified to prevent deepfakes. I used local Stable Diffusion on my solar powered 7900XTX to average a few selfies.
That sounds like a fun thing to do. Do I have a chance of doing that on my old box from 2013 without a dedicated GPU?
#crqhaoq
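For what it's worth, Stable Diffusion img2img does run CPU-only if the box has enough RAM (roughly 8GB+ for an SD 1.5 checkpoint), just very slowly: think minutes per image on a 2013-era CPU. Below is a minimal sketch assuming the Hugging Face diffusers library and a hypothetical selfie.jpg input; the model ID, prompt, and filenames are illustrative. One plausible reading of that profile description is blending a few selfies into one image first, then running a low-strength img2img pass like this over the result.

```python
# Minimal CPU-only Stable Diffusion img2img sketch (Hugging Face diffusers).
# Filenames, model ID, and prompt are illustrative assumptions; expect
# several minutes per image without a GPU.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float32,   # float32 on CPU; fp16 needs a GPU
)
pipe = pipe.to("cpu")

# Load and resize the source selfie to the model's native resolution.
init = Image.open("selfie.jpg").convert("RGB").resize((512, 512))

out = pipe(
    prompt="portrait photo of a person",
    image=init,
    strength=0.35,             # low strength keeps the likeness, alters details
    guidance_scale=7.5,
    num_inference_steps=30,
).images[0]

out.save("profile.png")
```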