Why not just "ML-ware"?
It's not specific to neural networks, it corresponds closely to what most people would refer to as "AI" today, and it explicitly excludes handcrafted algorithms. The resemblance to "malware" is serendipitous.
This is simple but surprisingly good, for the reasons you said. It's also easy to say and write. Along with fuzz- and hunch-, this is my favourite candidate so far.
Brainware.
Brains seem like the closest metaphor one could have for these. Lizards, insects, goldfish, and humans all have brains. We don't know how they work. They can be intelligent, but are not necessarily so. They have opaque, convoluted processes inside, which are not random but often produce unexpected results. They are not built, they are grown.
They're often quite effective at accomplishing something that would be difficult to do any other way. Their structure is based around neurons of some sort. Input, mystery processes, output. They're "mushy" and don't have clear lines, so much of their insides blur together.
AI companies are growing brainware at larger and larger scales, raising more powerful brainware. Want to understand why the chatbot did something? Try some new techniques for probing its brainware.
This term might make the topic feel more mysterious/magical to some than it otherwise would, which is usually something to avoid when developing terminology, but in this case, people have been treating something mysterious as not mysterious.
I wasn't keen on this, but your justification updated me a bit. I think the most important distinction is indeed 'grown/evolved/trained/found, not crafted', and 'brainware' didn't immediately evoke that for me. But you're right: brains are inherently grown, they're very diverse, we can probe them but don't always/ever grok them (yet), their structure is somewhat visible, somewhat opaque, they fit into a larger computational chassis but adapt to their harness somewhat, properties and abilities can be elicited by unexpected inputs, they exhibit various kinds of learning on various timescales, ...
"tensorware" sprang to mind
because the goal here is to have a word that people skeptical of the "lifeyness" or "brainyness" of ai will accept while still conveying that it's not normal software, I really like "moldware" and will be using it until something sticks better. it nicely describes the general nature of function approximators without getting into the weeds of why or how, or claiming function approximators have inherent lifeyness. it also feels like the right amount of decrease in "firmness" after software.
more candidates to reject from, a few favorite picks from asking an llm to dump many suggestions: fit-; contour-; match-; mirror-; conform-; mimic-; map-; cast-; imprint-
Mold like fungus or mold like sculpt? I like this a bit, and I can imagine it might... grow on me. (yeuch)
Mold-as-in-sculpt has the benefit that it encompasses weirder stuff like prompt-wrangled and scaffolded systems, and also kinda large-scale GOFAI-like things à la MCTS and whatnot.
Groware/grownware? (Because it's "grown", as it's now popular to describe it)
I don't really like any of those ideas. I think it's really interesting that 'aware' is so related, though. I think the best bet would be based on 'software'. So something like deepsoftware, nextsoftware, nextgenerationsoftware, enhancedsoftware, etc.
I like "evolveware" myself.
Nebulaware ...
Hardware / software is a contrast between 'the physical object computer' and 'not the physical object computer' ... I do think that models are certainly 'not the physical object computer', and what we are actually distinguishing them from is 'programs'.
'Pro-graphein' etymology is 'before-write'. If we look for greek or latin roots that are instead something like 'after-write', in a similar contrast (we wrote the program to do the planned thing, we do the <x> to write the unplanned thing) we get options like 'metagram', 'postgram' ... unfortunately clashing with the instagram wordspace ... or 'postgraph'.
(Existing actual words with similar etymology to what we're looking for with this approach: Epigram, epigraph, metagraph - which arguably is weirdly close in meaning to what we want but would be confusing to override.)
Looking instead at 'code', going back to codex, caudex (tree trunk/stem)... this still sort of works, but let's go for a similar word: folium, folio ...
Alternately 'ramus'/'rami', branch, leading to 'ramification', seems a promising direction in a semantic sense. It has a lot of association with developments and results that were not explicitly planned. ('Ramagram' is kind of a silly possible word in English, though. Then again, a lot of the AI development space has silly words.)
... More a starting point of ideas here than actually having dug up too many good-sounding words.
Going a step further into the etymology of 'program', it comes to mean 'write publicly' or 'written notice', which we could also contrast with roots meaning something else, like 'idi-' from 'idios' for 'private, personal, one's own', or in fact 'privus' itself. (Again, we need to keep clear of actual existing words like 'idiogram'.)
@the gears to ascension, could you elaborate on what the ~25% react on 'hardware' in "Would it be useful to have a term, analogous to 'hardware', ..." means? Is it responding to the whole sentence, 'Would it be useful to have...?' or some other proposition?
that was due to a bug in how lesswrong figures out what text a recorded react applies to. I'm not sure which react that was supposed to be, but my reacts weren't valuable enough, so I simply removed them.
Would it be useful to have a term, analogous to 'hardware', 'software', 'wetware', 'vaporware' etc.[1], which could be used to distinguish learned/discovered components of software, like gradient-trained DNNs, prompt-hacked LLMs, etc?
EDIT 2024-01-04: my current favourites are 'ML-ware' (HT Shankar), 'fuzzware' (me), and 'hunchware' (Claude), in that order; LW votes concur with 'ML-ware'.
In a lot of conversations with nonexperts, I find that the general notion of AI as being 'programmed' apparently still has a surprisingly strong grip, even after the rise of ML and DL made it even clearer that this is an unhelpful anchor to have. Thane recently expressed something similar, quite strongly.
David Manheim has a short take, 'AI is not software', which I think nicely encapsulates some of the important distinctions.
The important thing, for me, is that, in contrast to traditional software, nobody wrote it, the specification is informal at best, and we can't (currently) explain why or how it works. Software is traditionally 'data you can run', but until now this class of data was crafted (substantially) by human design.
A valid answer to this question is, 'no, we do not need such a term, just say, "learned components of software" or similar'.
In practice, we probably wouldn't apply this term to, say, a logistic regression, but maybe?
Some ideas, none of which I like enough yet:
After a bit of back-and-forth, Claude managed to produce a few which I think are OK, but I'm not very sold on these either.
For some illuminating compendia of -ware terms, see wiktionary, computerhope ware jargon, Everyware from rdrop, or gears' shortlist of suggestions. Notably, almost all of these are really semantically <thing>-[soft]ware with the 'soft' elided, e.g. spyware really means spy-software. ↩︎