[personal profile] pnictogen_wing
we would all like to get back into being better friends with computers. learning programming seems like a necessity if we're to survive the next several years because I have a feeling the landscape of personal computing is about to shatter.

we've been trying to help in the shattering process, I admit. Mono the Unicorn has been kicking away at the credibility of the "large language model", which seems like a cosmic joke of a technology, the world's most expensive Burroughs Machine. but people really do believe in it, and that's kind of terrifying actually. I'm quite prepared to believe that a lot of computer jockeys who feel like the Machine God is about to burst forth from their gibberish generator are shocked and amazed for the simple reason that they're seeing scraps of text they would never otherwise read. they're such limited people with limited intellects and a practically subliterate degree of language use because they're speaking a kind of street poetry or patois so liberally festooned with memes that you practically don't NEED to talk. it's actually sort of cool, but it's also rather obvious these people don't know how their machines work. so many layers of abstraction have been heaped atop the personal computer that these techie people plainly regard "the computer" more like a force of nature than a physical object. memory? electricity? data? surely these things merely flow like water or nitrogen.

in a way, that's delightful! fiction has met fact, in a way. where do you find such highly abstracted and stylized depictions of how computers work? in movies and games and comic books and fiction! this is how people talk about computers in stuff like Tron or Hellblazer, as if data and memory were substances, stuff. they certainly can be (in broad approximation) treated that way. but the real world is a place of infinite subtleties and these have all escaped the notice of the high-tech crowd. if they're bad at programming it's because at some level they don't even really know what a computer program is any more.

that's charming. they might even be as bad with computers as I am, despite all their bluster.

they're certainly not good with math. it's quite obvious in a hundred little ways that these programmer dudes have a mystical, innumerate sort of approach to numbers. they're numerologists though not honest ones. large numbers quite escape their grasp, but they're dazzled and impressed by them; small numbers tend to fall completely out of their sight. they love percentages so they have a habit of pretending that any fractions smaller than 0.05 or even 0.1 must not mean anything. Pfft, 5%, that's NOTHING!

anyway it would be pleasant to get that old feeling of facility back. I may have come to feel like my faith in the personal computer (it's sad to think that I did in fact HAVE one but I did) was betrayed, and thus conceive the sort of festering vengeful sense of offended justice that Emiya Kiritsugu once held for heroism. It's curious that our paths should have crossed as they did, and that we should have had so much in common, including a child's faith in a just Universe.

Apple Computer, most of all, has been like some Evil Empire in my mind, which is a bit silly I grant you, and yet...I can't let go of the feeling that they did in fact poison their tempting apple. they held out the promise of something that eventually they grew tired of trying to offer, so they settled for being COOL. but it's more than that.

think of what they did to George Orwell's 1984...they pretended it had a happy ending.

~Chara of Pnictogen

Date: 2024-09-24 05:38 am (UTC)
From: [personal profile] mathsbian
Some very apt observations about LLMbros here

Date: 2024-09-24 06:51 am (UTC)
From: [personal profile] mathsbian
I feel similarly, but I don’t even know what I would use an LLM to *do*. Even just as a practicality/accuracy/tone test, people who know more about those subjects have already done tests of their own. I definitely see the appeal of the art generators, I think I would end up fiddling with one forever trying to get my dnd PCs accurately portrayed. So I hold myself back on that. I can dig for the perfect picrew instead.

Date: 2024-09-24 11:33 pm (UTC)
From: [personal profile] malymin

I tried talking to some members of AWAY about it (anti-techbro AI art collective, effectively), and different people suggested the following:

Depends on what you're trying to do, honestly. "ChatGPT at home" is still a pipe dream and probably will remain so for quite a while, but local stuff can cover personal entertainment chatbot purposes pretty well these days.

However, the development of local text models has always been for one specific purpose (ERP, a.k.a. porn), so the best models for chatbots tend to be focused explicitly on that.

With enough memory (RAM mostly, but GGUF models can offload layers into VRAM), KoboldCPP, and the SillyTavern frontend, you won't have to pay CharacterAI, NovelAI, or any other commercial services of that nature.
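[Editorial aside: the "offload layers into VRAM" trick above amounts to simple arithmetic — divide the model file across its layers and see how many fit in the video memory you can spare. A minimal back-of-envelope sketch, with all sizes and layer counts being illustrative assumptions rather than measurements of any real model:]

```python
# Rough estimate of how many transformer layers of a quantized GGUF
# model could be offloaded to VRAM. Assumes layers are equally sized
# and reserves some VRAM for context and overhead; real loaders report
# the actual split when they start up.

def offloadable_layers(model_gb: float, n_layers: int, vram_gb: float,
                       overhead_gb: float = 1.0) -> int:
    """Count of layers that fit in the remaining VRAM budget."""
    per_layer_gb = model_gb / n_layers
    budget = max(vram_gb - overhead_gb, 0.0)
    return min(n_layers, int(budget / per_layer_gb))

# e.g. a ~4 GB 4-bit 7B-class model with 32 layers:
print(offloadable_layers(4.0, 32, 8.0))  # 8 GB card: whole model fits (32)
print(offloadable_layers(4.0, 32, 3.0))  # 3 GB card: partial offload (16)
```

Anything that doesn't fit stays in system RAM and runs on the CPU, which is why the quote above calls RAM the main requirement.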

Another person:

this is kind of a real problem because there's no easy way to run an llm afaik. I think Pinokio or LM Studio or Ollama are the easiest ones. LM Studio lets you simply download like Llama 3.1 7b to your pc.

(in response to someone suggesting Kobold) Yeah, but the easiest way to tell someone who's concerned about the environment to try an LLM is to simply run it on their own PC.

A few other people:

[Kobold ai is] not local but it's running entirely on gpus provided by volunteers

i found koboldai and stable diffusion mostly easy to install but im a mildly competent nerd type

i think you have to bite the bullet and maybe fuck around in DuckDuckGo's AI chat (easy free private way to use Claude etc) and then if you're interested, install something locally

IDK how helpful any of this is. I get the impression that running a local, offline instance of an LLM or an image-generation model (Stable Diffusion being the most frequently mentioned of the latter) is basically equivalent to running a resource-intensive computer game, though requiring more technical know-how to install.
