An Effective Grab Bag
Preface

I've been exploring low-cost ways to increase resilience in the AI safety community as part of my work on Alignment Continuity; grab bags are one such intervention. Since the Russia-Ukraine war kicked off, a few EAs have asked me about this kind of thing, so I decided that instead of just making a bag I'd make a guide too. Thanks to everyone who helped and offered feedback and suggestions.

Overview

Banana for scale, not emergencies.

Grab bags are a tool for increasing your resilience in emergencies. Not to be confused with 'bug out bags' - they are not intended to be your sole resource for several days or weeks spent living in the woods. It helps to think of a grab bag as an emergency toolkit.

A grab bag is defined as much by where it is as by what's inside it - it's something you have with you, or nearby, most of the time; it's something you can grab with only a moment's notice.[1]

> For example, if there's a fire you should be able to reach your bag without delaying your exit from the building.

A few good places to keep a grab bag:

* Your car
* Your home office
* By your front door

Contents and Packing

The contents are organized into several bags[2]: a first aid kit, [yourstuffhere], a hygiene kit, a utility kit, food and water, and spare clothes (also yours). The bags are packed in a specific order so everything is easily accessible and the first aid kit is the first thing you see when you open the bag.

Unpacking the bag (also available as a gif)

On the outside of the bag I've added a small LED dongle for ease of access in dark environments and a USB flash drive.

Packing the bag (also available as a gif)

Your Bag

The Bag

A rucksack or backpack is a good option, but I've chosen to build around a drybag - highly waterproof, rugged, and much cheaper than a waterproof backpack. All drybags are more-or-less created equal - this one comes with reflective patches, plenty of attachment points, and a shoulder strap.

Size

20 Litres is the sweet spot for a o

Totally. Asked only to get a better model of what you were pointing at.
And now my understanding is that we're mostly aligned, and this isn't a deep disagreement about what's valuable - it's a labeling and/or style/standard-of-effort issue.
E.g. Symmetry Theory of Valence seems like the most cruxy example because it combines above-average standard of effort and clarity of reasoning (I believe X, because Y, which could be tested through Z), with a whole bunch of things that I'd agree pass the duck test standard as red flags.