The speed at which electrical signals propagate is much faster than the speed at which electrons move in an electrical conductor. (Possibly helpful metaphor: suppose I take a broomstick and poke you with it. You feel the poke very soon after I start shoving the stick, even though the stick is moving slowly. You don't need to wait until the very same bit of wood I shoved reaches your body.)
The speed at which electrical signals propagate is slower than the speed of light, but it's a substantial fraction of the speed of light and it doesn't depend on the speed at which the electrons move. (It may correlate with it -- e.g., both may be a consequence of how the electrons interact with the atoms in the conductor. Understanding this right is one of the quantum-mechanical subtleties I mention below.)
When current flows through a conductor with some resistance, some of the energy in the flow of the electrons gets turned into random-ish motion in the material, i.e., heat. This will indeed make the electrons move more slowly but (see above) this doesn't make much difference to the speed at which electrical effects propagate through the conductor.
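To put a number on how slow that electron motion actually is, here is a back-of-the-envelope drift-velocity calculation. All the circuit values (10 A through a 1.5 mm² wire) are illustrative assumptions; the free-electron density is the standard figure for copper.

```python
# Drift velocity of electrons in a copper wire: v = I / (n * q * A).
# Assumed example values: 10 A through a 1.5 mm^2 household wire.

I = 10.0          # current, amperes (assumed)
A = 1.5e-6        # cross-section, m^2 (1.5 mm^2, assumed)
n = 8.5e28        # free electrons per m^3 in copper
q = 1.602e-19     # elementary charge, coulombs

v_drift = I / (n * q * A)   # metres per second
print(f"drift velocity: {v_drift * 1000:.2f} mm/s")  # well under 1 mm/s
```

Meanwhile the signal propagates at a substantial fraction of the speed of light, so the ratio between the two speeds is enormous, which is the broomstick point made above.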
(What actually happens in electrical conductors is more complicated than individual electrons moving around, and understanding it well involves quantum-mechanical subtleties, most of which are beyond me.)
It is not usual to convert AC to DC using relays.
It is true that if you take AC power, rectify it using the simplest possible circuit, and use that to supply a DC device then it will alternate between being powered and not being powered -- and also that during the "powered" periods the voltage it gets will vary. Some devices can work fine that way, some not so fine.
In practice, AC-to-DC conversion doesn't use the simplest possible circuit. It's possible to smooth things out a lot so that the device being powered gets something close to a constant DC supply.
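To make "smooth things out a lot" concrete, here is a toy numerical simulation of the classic approach: a full-wave rectified sine charging a smoothing capacitor across a load. Every component value is an assumption picked for illustration, not a design recommendation.

```python
import math

# Toy model: full-wave rectified 50 Hz sine feeding a capacitor C across a
# load R. While the input exceeds the capacitor voltage the diodes conduct
# and the capacitor charges; in between it discharges through the load.

V_peak = 10.0     # volts (assumed)
f = 50.0          # supply frequency, Hz
R = 100.0         # load resistance, ohms (assumed)
C = 1000e-6       # smoothing capacitor, farads (assumed)
dt = 1e-5         # simulation time step, seconds

v_cap = 0.0
samples = []
for step in range(int(0.1 / dt)):                        # simulate 100 ms
    t = step * dt
    v_in = abs(V_peak * math.sin(2 * math.pi * f * t))   # full-wave rectified
    if v_in > v_cap:
        v_cap = v_in                     # diodes conduct: capacitor charges
    else:
        v_cap -= (v_cap / (R * C)) * dt  # diodes block: capacitor discharges
    if t > 0.05:                         # skip the start-up transient
        samples.append(v_cap)

ripple = max(samples) - min(samples)
print(f"output ripple: {ripple:.2f} V on a {V_peak:.0f} V peak supply")
```

With these values the output stays within about a volt of the peak, instead of repeatedly dropping to zero as the bare rectified waveform does; a bigger capacitor (or a regulator stage after it) shrinks the ripple further.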
But there are similar effects even when no rectification is being done. You mentioned flickering lights, and until recently they were an example of this. If you power an incandescent bulb using AC at 50Hz then the amount of current flowing in it varies and accordingly so does the light output. (At 100Hz, not 50Hz; figuring out why is left as an exercise for the reader.) However, because it takes time for the filament to heat up and cool down, the actual fluctuation in light output is small. Fluorescent bulbs respond much faster and do flicker, and some people find their light very unpleasant for exactly that reason. LED lights, increasingly often used where incandescents and fluorescents used to be, are DC devices. I think there's a wide variety in the circuitry used to power them, but most will flicker at some rate. Good ones are driven in such a way that they flicker too fast for you ever to notice (somewhere in the kHz range).
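For anyone who wants to check that exercise numerically rather than on paper: the power dissipated in a resistive load is proportional to the square of the voltage, and you can simply count the power peaks over one second. The load resistance and peak voltage below are assumed example values.

```python
import math

# Instantaneous power in a resistive load on 50 Hz AC: P(t) = V(t)^2 / R.
# Counting local maxima of P over one second shows how often the light
# output peaks.

f = 50.0          # supply frequency, Hz
R = 100.0         # assumed load resistance, ohms
V_peak = 325.0    # peak of a ~230 V RMS mains sine

dt = 1e-5
peaks = 0
prev, prev2 = 0.0, 0.0
for step in range(int(1.0 / dt)):          # one second of signal
    t = step * dt
    p = (V_peak * math.sin(2 * math.pi * f * t)) ** 2 / R
    if prev > prev2 and prev > p:          # local maximum of power
        peaks += 1
    prev2, prev = prev, p

print(f"power peaks per second: {peaks}")
```

The count comes out at twice the supply frequency, because squaring the sine makes both the positive and the negative half-cycle dissipate power.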
Sometimes DC (at high voltages) is used for power transmission. I think AC is used, where it is used, because conversion between (typically very high) transmission voltage and the much lower voltages convenient for actual use is easy by means of transformers, and transformers only work for AC. (They depend on electromagnetic induction: changing currents produce changing magnetic fields, and changing magnetic fields induce voltages.) I don't know whether AC or DC would be a better choice if we were starting from scratch now, but both systems were proposed and tried very early in the history of electrical power generation and I'm pretty sure all the obvious arguments on both sides were aired right from the start.
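The transformer relation itself is just a ratio of turns. A minimal sketch, with made-up example numbers (real grids step down in several stages, not in one jump):

```python
# Ideal (lossless) transformer: V_secondary / V_primary = N_secondary / N_primary.
# The turns counts below are hypothetical, chosen only to make the arithmetic
# land near familiar voltages.

def secondary_voltage(v_primary, n_primary, n_secondary):
    """Voltage out of an ideal transformer given the turns ratio."""
    return v_primary * n_secondary / n_primary

# Stepping a 400 kV transmission line down toward 230 V mains needs a
# ratio of roughly 1739:1.
v_out = secondary_voltage(400_000, 1739, 1)
print(f"secondary voltage: {v_out:.1f} V")
```

Since an ideal transformer conserves power, the current scales the opposite way, which is exactly why transmission happens at high voltage: for the same power, higher voltage means lower current and lower resistive losses in the lines.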
When a device "consumes" electrical energy it isn't absorbing electrons. (In that case it would have to accumulate a large electrical charge. That's usually a Bad Thing.) It's absorbing (or using in some other way) energy carried in the electric field. It might help to imagine a system that transmits energy hydraulically instead, with every household equipped with high-pressure pipes, with a constant flow of water maintained by the water-power company, and operating its equipment using turbines. These wouldn't consume water unless there were a leak; instead they would take in fast-moving water and return slower-moving water to the system. An "AC" hydraulic system would have water moving to and fro in the pipes; again, the water wouldn't be consumed, but energy would be transferred from the water-pipes to the devices being operated. Powering things with electricity is similar.
Why 50/60Hz? It has to be too low to be heard, too high to be seen as flicker, high enough for practical transformers, low enough for low induction losses, and low enough for simple rotating machines. Trains could not use 50/60 Hz, so they went with one third of it (16⅔ Hz or 20 Hz).
Grid frequency is controlled to within ±150 mHz; if that fails, private customers might get disconnected/dropped.
The time derivative of the grid frequency is a measure of the relative power mismatch.
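That statement is the so-called swing equation for the aggregate grid: df/dt = f0 · ΔP / (2·H·S), where ΔP is the generation-minus-load mismatch, S the total rated power of the connected machines, and H their inertia constant. All numbers below are illustrative assumptions, just to show the order of magnitude.

```python
# Swing-equation sketch: how fast frequency falls after losing generation.
# Every value here is an assumed, round-number example.

f0 = 50.0        # nominal grid frequency, Hz
H = 5.0          # aggregate inertia constant, seconds (typical order)
S = 100e9        # total rated power of connected generators, watts
delta_P = -1e9   # sudden 1 GW generation deficit

df_dt = f0 * delta_P / (2 * H * S)
print(f"initial rate of frequency change: {df_dt * 1000:.1f} mHz/s")
```

With these numbers a 1 GW deficit on a 100 GW system pulls the frequency down at about 50 mHz per second, so the ±150 mHz control band mentioned above would be exhausted within seconds unless reserves kick in.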
True, I had not claimed that all criteria could be, or have been, met. Because of the noise and the heat, just the other day I replaced the inductive ballasts in some of my very old but still fully functioning kitchen counter lights with modern switching current regulators. The 50 Hz supply produced a 100 Hz hum that had been bothering me for decades. But even some of those switching regulators can be heard by some people. (Not me; I am deaf to anything above 10 kHz.)
It is a compromise in an area of sensory overlap, but the human senses are not equally sensitive at all frequencies. Your hearing is far more sensitive around 3 kHz. At your age you will still remember CRT monitors that ran at 60 Hz at maximum resolution; bad, but they did get used.