The first example that came to my mind of a recent notation that has caught on in its field is siteswap in juggling. It was only invented in the 1980s. I am a juggler and can confirm that all the technical juggling nerds know it, and it is used in crazy tricks. For example, see 5551 below, which I heard was the first trick found through the notation:
Juggling Lab is software for rendering these patterns.
EDIT: I probably misremembered with 5551; the Wikipedia article mentions 441.
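Validity of a pattern like 441 can even be checked mechanically, which is why new tricks can be found on paper. A minimal sketch in Python (vanilla siteswaps only, single-digit throws, ignoring multiplex and synchronous patterns):

```python
def is_valid_siteswap(pattern):
    """A vanilla siteswap is valid iff every throw lands on a distinct beat."""
    throws = [int(c) for c in pattern]
    n = len(throws)
    # A throw of height t made on beat i lands on beat (i + t) mod n;
    # a valid pattern fills each of the n landing beats exactly once.
    landings = {(i + t) % n for i, t in enumerate(throws)}
    return len(landings) == n

def ball_count(pattern):
    """The number of balls is the average throw height."""
    return sum(int(c) for c in pattern) / len(pattern)

print(is_valid_siteswap("441"), ball_count("441"))    # valid, 3 balls
print(is_valid_siteswap("5551"), ball_count("5551"))  # valid, 4 balls
print(is_valid_siteswap("543"))                       # throws collide: invalid
```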
Many popular languages today (notably the C family) ultimately descend from ALGOL, which is from 1958.
"Structured programming", i.e. writing code as syntactically-delimited blocks, functions, and procedures rather than with numbered lines and GOTOs, was pioneered in ALGOL.
Popular languages today such as Python, Java, JavaScript, Go, and Rust diverge pretty widely in features (and syntax), but all of them are ultimately ALGOL descendants, albeit with influences from other language families too.
(If your language has for loops, it's an ALGOL descendant.)
Lisp and Fortran are also pre-1960.
Simula (and thus object-orientation) is from '62, but influenced by ALGOL. Smalltalk is a Simula descendant. C++ is what you get if you try to build Simula ideas on top of a C compiler (and go a bit gaga for operator overloading).
There are some slightly later languages that look pretty different. For instance, APL is from the 1960s (Iverson's notation dates to his 1962 book). Forth is from 1970. ML, which gave rise to Haskell, is from '73.
I thought "new notation" included new symbols. Almost all programming languages exclusively use ASCII characters for their keywords, which are pretty old.
Many of these are skeuomorphic. Perhaps it can be argued that they have history from the '60s, but at this point the digital interfaces have supplanted any real-world metaphor. For example, the idea of showing a reticle moving along a line to represent "where you are in this song/movie" is a universal notation, and I don't believe it was common before personal computers.
Various kinds of tensor networks might be an example. Wikipedia claims that Penrose's graphical tensor notation is from 1971. Its descendant, ZX calculus is from as late as 2008. Arguably the first tensor networks were Feynman diagrams though, and 1948 is before your cutoff of 1960. (Actually, now that I think about it, it's kind of funny that the infinite dimensional case came before the finite dimensional one here.)
Relatedly: string diagrams (with Penrose's tensor notation apparently being seen as a precursor)
I do think there's some innovation in notation, but it mostly happens with existing typographic symbols, because extending typography is harder than it used to be. Previously, you could just come up with whatever you wanted, because work started out handwritten. Then you'd pay the printer to make whatever weird symbol you wanted for publication, or, if on a budget, come up with some approximation using simpler symbols.
It seems like it should be easier on computers, and in theory it is, but lots of things push us toward default choices. The worst of these is probably that Unicode is already full of so many symbols that LaTeX can render that it's much easier to pick an existing symbol than to go through all the work of cooking up a new one.
I separately suspect there's some effect from computer code, too, where people are trending toward longer symbols that resemble descriptive function and variable names. These feel less like notation and are easier to read at first glance, even if they cost some efficiency once you're familiar with them.
Some conjectures:
Possibly one factor is that the evident versatility of ASCII in nearly all programming languages (and for things like LaTeX) made people less inclined to invent new notation.
Emoji are a major potential example, as shown by the fact that the Unicode standard has been considerably extended to include them. However, it's debatable whether these are notations in the sense you mean (presumably technical symbols).
In avant-garde music there have indeed been notations invented, and to some extent adopted, since 1960. Back then it was quite common for composers to devise new notations for obscure techniques and the like in their own works, though there were usually existing (often better) ones, albeit not standardised. A 1974 attempt to set standards with a conference in Ghent only partially worked.
The rise of music notation software since the 1990s has increased standardisation, as composers now use such software (rather than pen & manuscript paper), which somewhat constrains what fanciful notations they can use.
Also, dataflow diagrams seem to come from the 1970s: https://en.wikipedia.org/wiki/Data-flow_diagram
Although visual dataflow programming seems to go back to the 1960s: https://en.wikipedia.org/wiki/Dataflow_programming
So yes, a bit later than 1960, but my examples are still quite old.
Huh, I did a bit of a search, and indeed very few examples show up, even if we allow those right at the 1960 cutoff.
Siteswap notation for juggling is the most common example, and dates to 1981. New tricks have even been discovered thanks to it.
Chess's PGN notation is from 1993, even if FEN is roughly a century older. Allegedly it took until around the 1980s for Anglosphere chess publications to switch to predominantly using algebraic notation, though the Germans were using it a century earlier and spread it to the Russians!
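FEN itself is simple enough that a position parser fits in a few lines. A toy sketch in Python (assuming a well-formed record, no validation):

```python
# A FEN record describes a chess position in six space-separated fields.
START = "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1"

def parse_fen(fen):
    placement, side, castling, en_passant, halfmove, fullmove = fen.split()
    # The placement field lists ranks 8 down to 1, separated by slashes;
    # letters are pieces (uppercase = white), digits are runs of empty squares.
    return {
        "ranks": placement.split("/"),
        "side_to_move": side,
        "castling": castling,
        "en_passant": en_passant,
        "piece_count": sum(ch.isalpha() for ch in placement),
    }

pos = parse_fen(START)
print(pos["side_to_move"], pos["piece_count"])  # w 32
```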
Rubik's Cube notation exists too, though mostly it's just standardization on which letters are used for the obvious concepts like moves (e.g. R for a clockwise quarter-turn of the right face, R' for counterclockwise).
There's a common core of Markdown notation used very often, stuff like asterisks for italics (and two for bold), which apparently came from informal Usenet conventions.
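That common core is small enough to sketch as a couple of regex substitutions in Python (a toy, not a real Markdown parser; it ignores escaping and nesting):

```python
import re

def mini_markdown(text):
    # Replace **bold** first so the single-asterisk rule can't eat its markers.
    text = re.sub(r"\*\*(.+?)\*\*", r"<strong>\1</strong>", text)
    text = re.sub(r"\*(.+?)\*", r"<em>\1</em>", text)
    return text

print(mini_markdown("some *italics* and some **bold** text"))
# some <em>italics</em> and some <strong>bold</strong> text
```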
Similarly, UML diagrams and the conventions of informal diagrams would date to after 1960.
Commutative diagrams in category theory were probably invented before 1960, but Categories for the Working Mathematician dates to 1971, so my guess is that using them all over the place only became common after 1960. String diagrams, Penrose notation, etc. are from after 1960 too.
Combinatorial game theory is from 1960 and the years after. I don't think there's that much novel notation there, but the standard notations for game values maybe count. Similarly, you probably count BNF notation as before 1960, since it was invented in 1959. For other things around the cutoff, the floor/ceiling notation for integers comes from Iverson's 1962 book, just after 1960.
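For reference, the Iverson notations in question, as usually typeset (floor, ceiling, and his truth-value bracket, the last popularized later by Knuth):

```latex
\lfloor 2.7 \rfloor = 2, \qquad \lceil 2.7 \rceil = 3, \qquad
[P] = \begin{cases} 1 & \text{if } P \text{ holds} \\ 0 & \text{otherwise} \end{cases}
```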
I think there's a general bias in Western culture arising from the problems of physicalism that gets people to consider realist ontology not worth seriously pursuing. Notation is downstream from ontology. You need to commit to an ontology to develop notation to represent that ontology.
I think there's a general bias in Western culture arising from the problems of physicalism that gets people to consider realist ontology not worth seriously pursuing.
Can you elaborate?
In physicalism, only things that are made up of matter are real. Autism, for example, is not made up of matter and is thus less real, from the physicalist perspective, than a chair. Realist ontology does treat autism as something that can be real.
One consequence of the more physicalist perspective is that the DSM doesn't really ask "What's the underlying mechanism behind autism, and how do we create an ontology that's true to that mechanism?" but rather "What are the shared symptoms clinicians can observe, and how can clinicians who disagree about the underlying mechanism still have a shared term to communicate and justify their treatments to insurance companies?"
Avoiding treating autism as something real with a specific, describable underlying mechanism, and instead orienting your ontology around symptoms, puts you in a bad position to make sense of mental illness.
The Sequences on LessWrong say a lot about epistemology but little about ontology. Most readers of LessWrong probably don't know what realist ontology is. If you search LessWrong for Barry Smith you get only four hits, despite Barry Smith being an important philosopher in applied ontology whose work affects what AI deployed in real-world contexts (like the current Iran war) does.
This neglect of good realist ontology is downstream of physicalism and causes issues in many different cases. I would expect more notation development if it didn't exist, because notation is downstream of having an ontology for which you create the notation.
Rephrasing what you are saying to check my understanding: conceptual progress has slowed down because most research is bottlenecked on ontology, and if we made progress there, we would see more new notations. As an example you bring up mental disorders, where people are more concerned with the politics of diagnosis than with understanding the underlying reason why "autism" is a thing (or how many distinct things are behind that label).

I feel like the Sequences actually are pretty good for the ontology stuff, or at least I can't think of anything better I've read. Noticing confusion is a great skill with no skill ceiling in sight, and the sequence on words taught me a bunch about semantics that seems important for exactly this. I am pretty curious what is up with this Barry Smith guy now, though. If you have specific reading recommendations from him, and can pitch what particular skill with regard to ontology you feel better at after reading him, that would be great.
Writing consists of language and also notations: systems of marks that communicate meaning in a specialized domain. Examples of fields with their own highly developed notations are music, mathematics, architecture, electronics, and chemistry. There are also more minor types of notation, for example in welding, meteorology, and finite state machines. Here's the question: all the notations I'm aware of were invented before about 1960. Over the past few decades, people have invented all sorts of fancy notations, but none of them have caught on in the applicable field. Why not?
Some answers: