As I recall, Docker itself is a rather "hacky" solution, with plenty of technical debt embedded in it, and I doubt AWS Batch is much better. In these domains, applying a "systemizing" approach to only your piece of the puzzle is not really feasible. If you're truly committed to the systemizing outlook, your choices are (1) yak shaving: set the original goal aside and start paying down technical debt in, e.g., Docker itself until the result is simple enough to understand in a systematic way; or (2) if that isn't workable, because some of the complexity you're dealing with cannot feasibly be reduced (or you simply don't care to do that sort of work), abstract away what's 'complex' and 'hacky' about the original subsystem and build a simpler interface that suffices for your goals and avoids 'leaking' the underlying complexity as far as practicable.
Both of these are in some sense high-cost, high-reward activities; however, the 'hacky' approach comes with very real costs of its own (it provides little or no protection against breakage of various sorts as complexity increases), so the common view of it as accruing technical debt that will need to be serviced or paid down later seems right to me.
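For what it's worth, option (2) might look something like the sketch below: a thin, purpose-built interface that the rest of the codebase calls, with all the Docker/AWS Batch messiness hidden behind it. Every name here (BatchBackend, run_parallel_jobs, JobResult) is hypothetical, not from any real library.

```python
# Sketch of option (2): hide the "hacky" subsystem behind a small interface.
# All names below are illustrative placeholders, not a real library's API.
from dataclasses import dataclass
from typing import Sequence


@dataclass
class JobResult:
    job_id: str
    succeeded: bool


class BatchBackend:
    """Knows about Docker images, job queues, retries, etc.
    That complexity stays behind this one class and never leaks out."""

    def submit(self, command: Sequence[str]) -> str:
        raise NotImplementedError  # e.g. call AWS Batch / Docker here

    def wait(self, job_id: str) -> JobResult:
        raise NotImplementedError  # poll until the job finishes


def run_parallel_jobs(backend: BatchBackend,
                      commands: list[Sequence[str]]) -> list[JobResult]:
    """The only surface the rest of the code sees: commands in, results out."""
    job_ids = [backend.submit(cmd) for cmd in commands]
    return [backend.wait(jid) for jid in job_ids]
```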
Let's call two broad clusters of problem-solving "systemizing" and "hacking." Systemizing roughly refers to building general, robust solutions. Hacking refers to quickly building one-offs. You would expect to accrue technical debt while hacking and pay it down while systemizing.
Both are useful in different situations. A big win for systemizing would be finding patterns in code and dramatically reducing the complexity by factoring out those patterns. A big win for hacking would be quickly testing several UIs to find one that potential customers love.
As you can probably guess, I lean very heavily towards systemizing, and have typically sucked at hacking – I generally relied on coworkers for that skill. But I started working on solo projects where I couldn't easily rely on someone else, so naturally, I developed a system for hacking things together.
The concrete example here is that I wanted to be able to run thousands of completely independent machine learning jobs quickly in parallel across a lot of computers, and I was looking into AWS Batch to do this.
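To make the shape of that task concrete, here's a minimal sketch of what "thousands of independent jobs on AWS Batch" can look like using boto3's array-job support. This is not the author's actual setup: the queue and job-definition names are placeholders, and it assumes a job definition that already points at a suitable Docker image.

```python
# Sketch: fan out 1000 independent tasks as one AWS Batch array job.
# Queue and job-definition names are assumed placeholders.
import boto3

batch = boto3.client("batch")

response = batch.submit_job(
    jobName="ml-sweep",
    jobQueue="my-job-queue",         # assumed queue name
    jobDefinition="my-job-def",      # assumed definition pointing at a Docker image
    arrayProperties={"size": 1000},  # 1000 independent child jobs
    containerOverrides={
        # Each child container gets AWS_BATCH_JOB_ARRAY_INDEX set by Batch,
        # so the script can use it to pick which of the 1000 tasks to run.
        "command": ["python", "train.py"],
    },
)
print("Submitted array job:", response["jobId"])
```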
Here's how I would have approached this in "systemizing" mode:
Here's how I did this in hacking mode:
This was really effective – previously, I might have spent ~40 hours deeply learning all the tools involved before I got the system up and running. Using the hacking approach, it only took me about 3 hours – literally an order of magnitude faster.
Of course, the hacking approach has clear disadvantages – I don't deeply understand Docker or any of the tools involved, and my experience here won't save me a ton of time if I work on a similar project. But in this case, I don't expect to do anything similar in the near-term, so hacking was a huge win. More generally, having the skill of hacking things together gives me many more options for projects in the future.