Just an opinion: ideas do not come from nothing, so the larger the data pool (memories, experiences, interests) the more ideas are likely to be generated.
It very much seems like we live in an age of hyperspecialization; people know a great deal about relatively few things. Generally, these areas of knowledge are complementary or related. Sometimes they overlap outright. Life is barely long enough to get good at one thing, so people often choose to specialize early and stay on one very fixed path.
From the outside looking in these collectives of experience are very tribal. They develop their own languages and symbols. They become these closed systems where ideas aren't created as much as they are simply refined; bounced back and forth among tribal members. But this does not seem to be a very good pattern for long term growth or sustainability. Homogenization leads to extinction.
What I mean by that can be understood by looking at the evolution of life on Earth as an example of the opposite pattern. Evolution tends toward diversity. Diversity gives life its best possible chance of success. That way, when an asteroid slams into the planet, not everything dies. If evolution had tended toward homogenization (only making the best dinosaurs possible) instead of diversity, the K-T event might have turned this planet into a lifeless rock.
It may be a bit of a weak analogy, but I feel like the same principles might apply fairly well to specific areas of knowledge. Ideas are the mutations that allow knowledge to change and evolve into something new. Exclusivity and specialization are a sort of homogenization that leads to stagnation and fewer truly new & good ideas. Not that ideas don't happen at all, just that maybe they happen less often than they should... or could. I don't know, really. This is mostly just speculation based on personal observation and opinion.
Anecdotally, the people I have known in my life who seem to have the most ideas are the ones whose interests are all over the map, so to speak. They tend to be older, with a deeper well of experience to draw from. Their knowledge pools, being varied as opposed to complementary, allow them to look outside these otherwise closed systems and make inferences, or to see patterns that people too mired in the subject matter might easily miss.
They may not always be good ideas, but they are often striking in their seeming originality and unexpectedness.
An example that comes to mind is a family friend who worked for years in automotive manufacturing before going back to school to get certified as a laboratory technician. He got a job as a lab assistant at a university research hospital. He would overhear the researchers in the break room talking about their current projects, and the one that really grabbed his interest was the problem of infectious disease control, specifically getting healthcare professionals to wash their hands between patient interactions. Drawing on his manufacturing experience, he had the idea of applying poka-yoke (a Japanese manufacturing term that roughly means error-proofing) to the problem. His idea was to install sink-locks on all the doors to patient rooms: a door would only open from the outside if the sink had been used for at least 20 seconds immediately beforehand, or if an emergency button was pushed. From the inside, the doors would open at will. He mentioned the idea in casual conversation with one of the senior researchers, who was so excited by it that he wanted to design a study around the concept.
I feel like there is a potential benefit to be had by looking outside as opposed to focusing too intently within. Maybe spending some percentage of time learning about completely new things as opposed to only endeavoring to learn new details about things we already know might yield an increase in new ideas. There's nothing wrong with getting out of our comfort zone and challenging our perspectives.
I was thinking in a very different direction upon reading "a lot of people also find that writing down your ideas causes you to have even more ideas." I know what you mean in the context of a reinforcement system, but I think it misses the more pressing phenomenon of working on ideas: at least in my experience, the uncertainty over whether I'm inventing or indulging.
The "even more ideas" part sounds to me like a sort of (combinatorial) explosion, whereas my stroke of inspiration usually turns out to be much more problematic, much less elegant than I thought. Sometimes this also means much less original than I thought, but that isn't a bad thing-- convincing oneself that something is being discovered is often the most effective way of grokking it! You don't really lose anything when you find out that it's, in fact, old news.
Sometimes all this means is that it will take more work than I thought to follow the idea through. Other times it means this is the wrong rabbit hole.
But I think many of us AD(H)Ders develop a suspicion of, or even hostility to, this "indulgent" signal, the internal phenomenon of believing oneself to be creative, because we've too easily looted the reward (been superficially creative) at the expense of rigor ("why finish all those exercises in that boring book when this system I just wrote down is totally AGI already?")
(At the same time, you and Lawrence Block are 100% right about nurturing/environment as well)
All in all, I can't wrap my head around "what is the difference between a producer and a consumer of thought?" because the question as posed seems to hold rigor, even quality, constant/irrelevant.
(Many years ago a composer told me that when Schoenberg was at UCLA, young composers had to spend hours with him for 3+ years just analyzing Mozart before he would consider looking at their music, compared to conservatories now, where you're expressing yourself from day one. There is doubtless an analogy to AI risk-- which culture is more productive?)