Joseph Paul wrote:
Yeah, that idea got me thinking about what would actually be needed of a brain wrangling multiple drones.

Attention and consciousness are relatively narrow bandwidth, as the psychologists have shown.
So we're looking at real-time strategy game style unit management. "Go to co-ordinates (X, Y)" rather than "Thrusters 60%, roll x for y seconds, pitch n degrees..."
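That split - operator sets goals, drone handles actuation - can be sketched in code. This is a purely hypothetical illustration (the `Drone`/`Wrangler` names and structure are my invention, not anything from the setting): the operator's narrow-bandwidth channel only ever carries waypoint-level commands, while thrust/roll/pitch stay onboard.

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    x: float
    y: float

class Drone:
    """Hypothetical drone: accepts high-level goals, computes its own low-level control."""
    def __init__(self, ident: int):
        self.ident = ident
        self.goal = None  # no standing orders yet

    def goto(self, wp: Waypoint) -> None:
        # The operator only sets the goal; thrusters, roll and pitch
        # are resolved onboard, outside the brain link entirely.
        self.goal = wp

class Wrangler:
    """One operator managing many drones, RTS-style."""
    def __init__(self, drones):
        self.drones = {d.ident: d for d in drones}

    def command(self, ident: int, wp: Waypoint) -> None:
        self.drones[ident].goto(wp)

squad = Wrangler([Drone(i) for i in range(4)])
squad.command(2, Waypoint(10.0, -5.0))  # "go to co-ordinates (X, Y)"
```

The point of the sketch: the operator's side of the interface is a handful of symbols per command, which is all a narrow attention channel can realistically carry.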

Our hypothetical drone wranglers are going to need a lot of training, as well as many layers of safety built into the interface, to prevent crippling hallucinations or lethal biofeedback.

Was this supposed to be piped directly to the brain, or played over the eyes with lasers?
Use as many channels as possible.

Some potential Inputs/"Displays":
- vision
- hearing
- touch: vibration, pressure, pain, itch, temperature
- proprioception (joint position sense, body orientation) 
- acceleration & rotation - vestibular system (otoliths, semicircular canals)
- taste
- smell

Some potential "control channels":
- small muscle movements - extra-ocular, blink
- (micro)vocalisations
- touch mapping e.g. tongue to teeth/gums, skin as touch pad (scratch an itch, fire a missile?)
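Those control channels are all low-bandwidth and discrete, so the natural design is a lookup table from (channel, gesture) to command, with anything unmapped discarded - one of the safety layers mentioned above. A minimal sketch, with entirely made-up gesture names:

```python
# Hypothetical mapping of low-bandwidth body signals to drone commands.
# Channel and gesture names are invented for illustration.
COMMAND_MAP = {
    ("blink", "double"): "confirm",
    ("tongue", "upper-left-molar"): "select_drone_1",
    ("tongue", "upper-right-molar"): "select_drone_2",
    ("subvocal", "hold"): "hold_position",
    ("skin", "forearm-swipe"): "fire_missile",  # scratch an itch, fire a missile
}

def decode(channel: str, gesture: str) -> str:
    # Unmapped gestures are ignored rather than guessed at - a deliberate
    # safety layer against twitches and stray micro-movements.
    return COMMAND_MAP.get((channel, gesture), "no_op")

print(decode("skin", "forearm-swipe"))  # prints: fire_missile
print(decode("nose", "wiggle"))         # prints: no_op
```

Training, then, is mostly the wrangler building muscle memory for this table - which is exactly where the toys come in.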

Really, what the hell do you need to be networked to other people's senses for as a child?

You don't - but stuffed toys and action figures you can control with your mind will likely sell like hot cakes.
The more feedback they can provide, the better - train the children with play.

There are lots of applications for the technology: industrial, civil, military.
This is what I think the Tech Level 13(? - MegaTrav expanded TL lists) brain interfaces look like in broad outline.



Rob O'Connor