This analysis does not mention cancer, and cancer's identification as a "modern disease" is greatly confounded by the fact that civilization has brought both longer lifespans (revealing cancer; breast cancer rates in women are far higher above age 60 than below) and great improvements in the ability to detect cancer (and thus to link deaths to it). An excellent discussion of these issues is found in Siddhartha Mukherjee's The Emperor of All Maladies.
You stated that when "hunter-gatherers switch to eating modern diets they start getting these diseases at high rates." What evidence is there for this claim, in particular for cancer?
Currently, my firm and its allies are trying to push the government into requiring the schools to use a Bayesian prediction model: you feed in an individual student's test scores for the past 5 years, it outputs their probability of success in the advanced classes, and you keep placing the students with the highest probability of success in the top classes until you run out of teachers.
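Concretely, the pipeline would look something like the sketch below. This is only illustrative: the actual model class, features, and capacity numbers would be set by the district, and logistic regression here just stands in for whatever calibrated probability model gets adopted.

```python
# Illustrative sketch only: logistic regression stands in for "a Bayesian
# prediction model", and all names and numbers here are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

def assign_advanced_seats(past_scores, succeeded, current_scores, n_seats):
    """past_scores: (n, 5) test scores from prior cohorts, one column per year.
    succeeded: 0/1 outcomes (did those students succeed in advanced classes).
    current_scores: (m, 5) scores for this year's students.
    n_seats: total advanced-class capacity, limited by available teachers."""
    model = LogisticRegression().fit(past_scores, succeeded)
    p_success = model.predict_proba(current_scores)[:, 1]
    # Greedy fill: take students in descending order of predicted
    # probability of success until the seats run out.
    ranked = np.argsort(-p_success)
    return ranked[:n_seats], p_success
```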
This is good, and I hope such models are implemented. However, when I hear the phrase "problems in education," these sorts of placement problems are not the first thing that comes to mind.
I personally taught for several years at a massively failing inner-city high school (only 2% of the students were white, and only 10% met state educational goals). The few "advanced classes" my school offered were, indeed, filled by students who had achieved top scores on a variety of metrics (top compared to the other students at the school, that is). I taught one such advanced class, as well as several general-placement classes. The administration assigned students to each class without using a Bayesian model, but I honestly don't think the resulting student distributions would have changed much if they had used one.
The problem was never making sure that the students with the highest probability of success made it into the advanced classes; my administration, for whatever its other failings, had mostly solved that one. The consuming, stultifying problem was that in my advanced 10th grade classes, only a few of my students could read even at an 8th grade level. The situation was even worse in my general classes, where most students read at a 6th grade level.
If you constrain it to an 8-hour spread, it does indeed help - you'd only need around 34 people agreeing to commit to 1 hour each, which is even more optimistic than your guess. And if we can get people to coordinate smaller specific time slots, perhaps convincing 25% to take a 1.5-hour slot and the rest to commit to a minimum of 1 hour, this moves things closer to needing a group of only 30 (at an average of about 1.125 hours per person, the same ~34 person-hours takes roughly 30 people). Not too bad.
To estimate how much of an underestimate that was, I wrote a very short program to simulate this scenario. From my model, we cross over to 80% at about 85 people. Incorporating a random spread in how long people are logged in, from 0.5 to 1.5 hours, doesn't change anything.
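For concreteness, here is a minimal version of that kind of simulation. It is a sketch rather than the exact program: it assumes each person logs in at a uniformly random time over a 24-hour day and measures the expected fraction of the day with at least one person online, which may not be the precise criterion behind the 80% figure.

```python
import random

def expected_coverage(n_people, duration=1.0, day=24.0,
                      trials=500, resolution=0.2):
    """Monte Carlo estimate of the average fraction of the day
    during which at least one person is online."""
    total = 0.0
    steps = int(day / resolution)
    for _ in range(trials):
        starts = [random.uniform(0, day) for _ in range(n_people)]
        covered = sum(
            1 for i in range(steps)
            # A person starting at s is online when (t - s) mod day < duration,
            # which handles sessions that wrap past midnight.
            if any((i * resolution - s) % day < duration for s in starts)
        )
        total += covered / steps
    return total / trials

for n in (34, 60, 85):
    print(n, round(expected_coverage(n), 3))
```

Giving each person a random session length is a small change: draw per-person durations from random.uniform(0.5, 1.5) alongside the start times and test against those instead of the fixed duration.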
I am not sure how many people you could get to sign up, but the fewer you get, the more hours each would have to commit. From my model, if you can only get 60 people, they'll need to commit an average of 1.5 hours; with 45 people, you'd need 2 hours each.
The numbers don't look too good. Even 60 people, with an average commitment of 1.5 hours, seems like a challenge. Maybe the LW community could meet it?
I highly recommend you look into the work of Joao Pedro de Magalhaes, who is doing diverse and interesting work in aging bioinformatics and in aging science in general. Some excellent recent papers:
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4203147/
http://www.nature.com/nrc/journal/v13/n5/abs/nrc3497.html
http://www.tandfonline.com/doi/abs/10.4161/15384101.2014.950151