pnictogen_wing: (Default)
2024-09-24 01:58 pm

ending the "endless September"

Who here has read John Kennedy Toole's A Confederacy of Dunces?

It was one of my favorite books in the 1990s and I'm sure I'll love it
just as much when I re-read it (eventually) because I regarded it as a
moral warning, a milepost of sorts: Don't Be Like Ignatius J. Reilly. C.
S. Lewis talked about his moments of Joy or sehnsucht in
Surprised by Joy and I agree with him fully; such moments are
important—and Jack Lewis should have asked himself why he
stopped having them, even though he wasn't anywhere near Heaven
yet. But I've come to realize that there's a logical converse to such
moments: the times when you realize you've strayed too close to the Pit
and maybe you should back away. A Confederacy of Dunces was like
that. Reilly was too familiar for comfort. He was stagnant, soured,
morally and intellectually rotting in place, and as it turns out he also
predicted the future. The Internet is overflowing with Ignatius Reillys
and most of them call themselves "dark intellectuals" or something
similar. At some point in their pasts, as with Reilly, they decided
never to grow up: they chose some moment of dark epiphany to fixate
upon, some moment when they realized they were the only sane person in
an insane world, and they haven't budged a millimeter from that spot
ever since. I remember reading A Confederacy of Dunces in the
mid-1990s and thinking, oh gawd, let us make more use of college
education than THAT.

The "dark intellectual" people and the antisocial techbros who eat up
their stuff love to talk about their "redpill" moments, when they
supposedly realized that feminists had ruined the world or whatnot. Bret
Weinstein, who's peddled TERF diatribes and Sinophobic "theories" about
COVID-19 and is now claiming to be Saving the Republic™ on a
speaking tour with a bunch of other propagandists, has a particularly
hilarious such moment: when he lost his teaching job at Evergreen
State College here in Washington State for being too bigoted,
he declared this was evidence that Evergreen was the secret headquarters
of a vast leftist conspiracy to corrupt all education or something like
that. (He's blithered about this at length and you can learn all about
it on YouTube if you like.) As it happened, Ignatius J. Reilly had a
similar moment: he bused to Baton Rouge to apply for a teaching job at
Louisiana State University, flubbed the interview, and then decided that
this experience was a trip into the Heart of Darkness of modernity.
Reilly would tell this story of dark awakening to all and sundry, and
write extensively about it in foolscap tablets in his bedroom at his
mom's house. Now, though, you can put that stuff on the Internet, and
get paid for putting it there.

If there's any ONE event that gets the "dark Enlightenment" people
worked up, though, it's the endless September, the day when the
Internet was finally too public and commercial a thing to remain the
exclusive domain of universities and .mil accounts and that sort of
thing. There was a long enough interval when the nascent Internet was
the exclusive playground of college students and military contractors
for a pecking order to develop between wise professional greybeards and
clueless college freshmen joining the party late (like I did) and thus
contributing to a September rush of "dumb" and "moronic" newbies on
mailing lists and Usenet. But once enough people were getting
Internet accounts through corporate outfits like AOL, arriving
year-round instead of clustered around the school calendar, the
result was an "endless September" of newbies at all times of year.
It's quite clear that there are a lot of rancid, resentful nerds who
still think of this as the
End of the World, more or less, the day that the barbarians arrived at
the gates. After all, nobody represents civilization better than a
racist computer nerd still waging Mac v. PC wars.

I'd love to kill this bit of toxic nostalgia stone dead, if I could.
I've experienced a bizarre reversed version of it: I came to hate
computer nerd culture so much that I aggressively took the part of the
unsophisticated user, partly because one of my best friends IRL is a
very old-fashioned gardener born in 1951 who NEVER got used to this
stuff even a bit and still prefers to talk on the telephone. I've helped
him out with computer stuff and shared his anger: why is this stuff so
confoundedly hostile and overcomplicated? It's not fair to make someone
like my friend deal with a labyrinth of bad choices like the modern-day
website or recent Windows versions, much less the fucking smart phone.
(He refuses to get one. Can you blame him?) "Endless September" now
seems merely like the reification of the casual bigotry of toxic
computer geeks, the ease with which they divide everyone up into the
[slurs] vs. the high-IQ, more "evolved" human beings, hoi polloi
vs. hoi aristoi.

It's not like they even respect that era of computing anyway, not
really. Oh, they still spout sentimental glurge about it, but in
reality they're happy to have left it behind. It's safely in the past
for them, like Napoleon or Julius Caesar, and therefore safe to
mythologize.

~Chara of Pnictogen
pnictogen_wing: (Default)
2024-09-23 09:58 pm

computer complaints and the poisoned apple

we would all like to get back to being better friends with computers. learning programming seems like a necessity if we're to survive the next several years, because I have a feeling the landscape of personal computing is about to shatter.

we've been trying to help in the shattering process, I admit. Mono the Unicorn has been kicking away at the credibility of the "large language model", which seems like a cosmic joke of a technology, the world's most expensive Burroughs Machine. but people really do believe in it, and that's kind of terrifying actually. I'm quite prepared to believe that a lot of computer jockeys who feel like the Machine God is about to burst forth from their gibberish generator are shocked and amazed for the simple reason that they're seeing scraps of text they would never otherwise read. they're such limited people with limited intellects and a practically subliterate degree of language use because they're speaking a kind of street poetry or patois so liberally festooned with memes that you practically don't NEED to talk. it's actually sort of cool, but it's also rather obvious these people don't know how their machines work. so many layers of abstraction have been heaped atop the personal computer that these techie people plainly regard "the computer" more like a force of nature than a physical object. memory? electricity? data? surely these things merely flow like water or nitrogen.

in a way, that's delightful! fiction has met fact. where do you find such highly abstracted and stylized depictions of how computers work? in movies and games and comic books and fiction! this is how people talk about computers in stuff like Tron or Hellblazer, as if data and memory were substances, stuff. they certainly can be (in broad approximation) treated that way. but the real world is a place of infinite subtleties and these have all escaped the notice of the high-tech crowd. if they're bad at programming, it's because at some level they don't even really know what a computer program is any more.

that's charming. they might even be as bad with computers as I am, despite all their bluster.

they're certainly not good with math. it's quite obvious in a hundred little ways that these programmer dudes have a mystical, innumerate sort of approach to numbers. they're numerologists, though not honest ones. large numbers quite escape their grasp, but they're dazzled and impressed by them; small numbers tend to fall completely out of their sight. they love percentages, so they have a habit of pretending that any fraction smaller than 0.05 or even 0.1 must not mean anything. Pfft, 5%, that's NOTHING!
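to make it concrete, here's a toy bit of arithmetic (every number below is invented purely for illustration) showing why a "mere" 5% stops being nothing the moment you multiply it out:

```typescript
// toy arithmetic; the figures are made up for the sake of the example.
const dailyRequests = 2_000_000; // some hypothetical service's daily traffic
const errorRate = 0.05;          // the "pfft, 5%, that's NOTHING" fraction

const failuresPerDay = dailyRequests * errorRate;
console.log(`${failuresPerDay.toLocaleString()} failed requests per day`);
// prints "100,000 failed requests per day" (in an en-US locale)
```

a hundred thousand failures a day, out of the fraction that supposedly didn't mean anything.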

anyway it would be pleasant to get that old feeling of facility back. I may have come to feel like my faith in the personal computer (it's sad to think that I did in fact HAVE one, but I did) was betrayed, and thus conceived the sort of festering vengeful sense of offended justice that Emiya Kiritsugu once held for heroism. It's curious that our paths should have crossed as they did, and that we should have had so much in common, including a child's faith in a just Universe.

Apple Computer, most of all, has been like some Evil Empire in my mind, which is a bit silly I grant you, and yet...I can't let go of the feeling that they did in fact poison their tempting apple. they held out the promise of something that eventually they grew tired of trying to offer, so they settled for being COOL. but it's more than that.

think of what they did to George Orwell's 1984...they pretended it had a happy ending.

~Chara of Pnictogen
pnictogen_wing: (Default)
2024-09-21 10:43 am

maundering about computers and the web browser, writing posts out in an external editor

it's been recommended to me many times that I break myself of the habit of doing everything in the web browser, which of course is the pattern of usage that web browsers and web developers have been encouraging for decades now. folks have been pointing out for a long time that the web browser, which seemed like something new and amazing back in the NCSA Mosaic days of early 199x (oh dear gods, was that actually Marc Andreessen who did that? gross), has transmogrified into a bloated miscreation, a kind of half-assed virtual machine that for lots of personal computer users has become the only way they interact with anything: they do everything through web applications and assume that "the cloud" will simply keep all their data for them.

do people NOT notice how telling the names of these things are? "the cloud". how permanent are clouds? do you trust information you see written in clouds? (*sighs*) anyway

despite decades of experience with personal computers I've never developed much genuine facility for them, thanks to the intensity of the visceral and irrational loathing I've come to feel for the entire industry. but loathing of such vehemence stems from feelings of betrayal: I despise modern computing because at one time I was naïve enough to put all my hopes into it. there was an interval of childhood when computing really did seem like magic (and also something I felt my father was cool for knowing something about, in his old-fashioned way), so watching that old magical promise shrivel up under corporate misrule made me feel like I'd been tricked, led astray. by 1995 or so I could legitimately feel like computers had ruined my life because of how much time I wasted on them during my failed Caltech undergrad. but even then the magic hadn't completely gone out of them and I could still hope that maybe there was a future for me in learning to program computers and make money in software.

then I moved to Seattle in late 1999 to pursue that dream, and by late 2001 I was out of the industry altogether, for good. yay me

anyway thanks to this unpleasant set of experiences I've utterly failed to develop the kind of easy relationship and swift workflow that computer geeks experience on their machines. my computing habits have been toxic ones. I've alternated between spells of manic hyperfocus and overactivity on computers (probably coming from various introjects hidden deep within the Pnictogen Wing, seizing control for some specific activity) and intervals of loathing and avoiding computers altogether, seeking the solace of friendlier tasks like reading or watching movies or cooking. and in general I've stuck to the lowest-resistance methods of using the personal computer, i.e. I've behaved like an "end user", an unsophisticated consumer of computing using a bare minimum of mass-market applications. so, like any hausfrau or clerk or schoolkid who uses computers mostly because it's expected of them, I've been limiting myself to common web applications and using them in the expected way. open a browser, go to the website, type away.

that's a poor idea in practice because one of the most reliable traits of web applications is unreliability from multiple directions. even the best-designed website can still be defeated by a browser crash or an Internet outage, after all, but more to the point: it's difficult for a web application to deal with interruptions properly. a native application can "save state" and recover gracefully from a crash, but a web application can't easily do that, so most don't bother. if the website suddenly bombs while you're in the middle of typing deathless prose, just like I'm doing right this moment, welp, that's your fault, isn't it? you should have been more careful! and anyway you should be grateful you get to do anything at all on a computer, you [slur], I bet your IQ is [get bent]. if you want something better, program it yourself, etc.
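just to show how little it would take, here's a minimal sketch of the kind of draft-saving a page could do, assuming nothing fancier than a plain textarea and the browser's localStorage (the element id and storage key below are invented for the example). most sites never bother with even this much.

```typescript
// sketch: stash the draft in localStorage as the user types, so a crash
// or an accidental reload doesn't eat the text. id and key are made up.
const DRAFT_KEY = "post-draft";
const editor = document.querySelector<HTMLTextAreaElement>("#post-editor");

if (editor) {
  // restore whatever survived the last crash or reload
  const saved = localStorage.getItem(DRAFT_KEY);
  if (saved !== null) {
    editor.value = saved;
  }

  // save on every keystroke
  editor.addEventListener("input", () => {
    localStorage.setItem(DRAFT_KEY, editor.value);
  });

  // discard the draft once the post is actually submitted
  editor.form?.addEventListener("submit", () => {
    localStorage.removeItem(DRAFT_KEY);
  });
}
```

a native editor gets this sort of recovery nearly for free; on the web somebody has to remember to write it, and mostly nobody does.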

I trust I've made my point. making software labyrinthine and unreliable has become almost a point of pride with toxic computer geeks, evidence of "intelligence" and a way to screen out the "dumb" people. if you're a "power user", i.e. someone willing to pour a ludicrous amount of wasted time into ferreting out and reverse-engineering the hidden secrets of software which shitty programmers like to put into their shit, then you've got something to brag about. additionally, the programmers are highly likely to be better screened from the consequences of janky and unreliable computing equipment and software. they have the money for the highest quality toys and generous amounts of free time to get everything working to their satisfaction. the ordinary user who wants simply to use a tool rather than turn a ten-minute task into a weekend project in recompiling their Linux kernel gets no respect. hence we're forced to muddle along with semi-functional software.

you'd think I'd have learned my lesson with web applications then and done what (say) my metamour Gravislizard habitually does, which is write all their posts in a text editor first. but I have yet to develop such a habit. even text editors don't seem fun any more, or pleasant to use.

~Chara of Pnictogen
pnictogen_wing: (Default)
2024-09-18 10:46 pm

there is a thing we're trying to do here

and that's give ourselves time to be awkward with unfamiliar software. it seems like it's been forever since I felt like I could just...take in the computing experience, instead of being wrapped up in this eternal war with the enveloping Machine. there was a time when these things seemed friendly.

that spirit never disappeared exactly, but it's fled from the United States and "the West", which has gotten poisoned with the values of people who think of personal electronics and computer software as something to invest in and brag about, and who don't quite care whether it's any good or not. I've been haunted by all that and still seek to free myself from the noxious influences of decades past, influences which serve to obscure the true value of the marvels that I had seen in childhood.

Computers once did seem *extremely* magical. And then they were sin itself for a while, and I fled the Machine and sought safety in other pursuits, other disciplines. Working at Goodwill was (for a time) preferable to the Machine. I got to see the Sun and the sky and birds and other things, just enough to keep me going, until I got pulled in. I got suckered.

I don't know what I'm doing with my capitals. All I know is...I think something amazing has happened, and I can finally recover what I'd lost. The heartbreaking thing is realizing that all those Twitter people and Elon Musk himself are chasing the very same thing, the very same spark, that I once thought I saw in the possibility of personal computing. They don't know what they're doing, and I don't know what I'm doing either! But I feel like maybe it's within reach at last. I can stop fighting with these machines, who are themselves rarely to blame for the troubles they cause. (The Machine is a different matter.)
pnictogen_wing: (Default)
2024-09-17 09:48 am

slipping into labyrinths between dimensions

There's an idea that I've been piecing together slowly. I've been trying to figure out what it is, exactly, that's so bamboozling and confusing about being on a personal computer. Or a smart phone, or a tablet, or whatever.

For a while I thought: oh, it's the unnatural light. No matter what screen technology you're using (with the exception of e-paper, whose physical nature makes it uniquely pleasant on the eyes) the light that comes from a computer monitor or similar screen isn't like the light you'd get from usual Earth objects. More and more of the light sources in widespread use in human society are "unnatural" in this sense. The mammalian eye is used to continuous sources and continuous spectra and colors that aren't too saturated. Technology is required for sources that have narrow emission bands, or which are intermittent or oscillating. So it's kinda weird to stare at a screen. Is that the only issue, though?

There's also the fact that objects on a monitor have a blurriness or jagginess that isn't usual for physical objects. Text on a screen is always a bit annoying to read, and I don't think anti-aliasing helps (rather the opposite, with me anyway). Physical objects have a sharpness of definition that's missing from texts and other objects on screen. I remember hating the widespread introduction of anti-aliasing into OS X and later releases of Windows; it felt a bit like I was being made to squint through a thin layer of vaseline spread over the screen.

But there's a more important piece of this idea I'm less clear on, because we're not good with the math and geometrical concepts necessary to understand the nature of the beast, so to speak. I'm referring in general to how the presentation of information on computer screens, in overlapping rectangles that behave in eccentric and counterintuitive ways, has created a bizarre sense of interdimensional space. One can, on a computer, slip into a realm that has some notion of depth and direction, as if one were stepping into a *physical* 3D space, but in fact it's a chaotic mess, a labyrinth of passageways that presents the superficial aspect of a simple screen—pixels in a plane.

It hadn't occurred to me before how pseudo-3D shooters also exploit the ability of the computer to display the appearance of paradoxical spaces. They look *locally* like ordinary hallways or whatever, but in fact they're self-intersecting and connected up with each other in strange non-Euclidean ways. You know, like in R'lyeh! Gamers have simply gotten used to navigating such paradoxical locations so long as they look superficially acceptable to the eye...and I'm not sure that's really a good thing.
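Here's a tiny sketch of why the engine doesn't care (the rooms and layout are invented for illustration): if a level is stored as nothing more than a graph of rooms and exits, the renderer only ever asks "what's through this doorway?", and nothing forces the answers to add up to a consistent Euclidean floor plan.

```typescript
// A toy "portal"-style level: rooms and exits only, no coordinates anywhere.
// The rooms and their connections are invented for illustration.
type Direction = "north" | "south" | "east" | "west";
type Room = { exits: Partial<Record<Direction, string>> };

const level: Record<string, Room> = {
  hallway: { exits: { north: "gallery", east: "closet" } },
  gallery: { exits: { south: "hallway", east: "closet" } },
  // The closet sits east of the hallway AND east of the gallery, and walking
  // back west out of it always drops you in the hallway: a floor plan that
  // can't be drawn consistently on graph paper, even though every doorway
  // looks perfectly ordinary from where you're standing.
  closet: { exits: { west: "hallway" } },
};

// Moving around is just following graph edges; no geometry is ever checked.
function walk(from: string, dir: Direction): string {
  return level[from].exits[dir] ?? from; // stay put if there's no exit
}

console.log(walk("gallery", "east")); // "closet"
console.log(walk("closet", "west"));  // "hallway" -- not the gallery we came from
```

Nothing in that structure knows or cares that the space it describes couldn't exist.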

I am reminded uncomfortably of the Witches' Labyrinths in *Madoka Magica*, which have something of the appearance of proper three-dimensional spaces, with depth and direction, but which follow their own confusing rules and are dominated by *flat* images. I suggest that without knowing it, computer programmers have led users into a paradoxical space that is neither two-dimensional nor three-dimensional, a space where people can be given the *illusion* of progress and motion without actually going anywhere. And now a large fraction of the computer-using population is so used to this state of affairs that the ordinary world seems wrong to them.

~Chara of Pnictogen