Meta is already moving toward gesture-controlled glasses, and its CTO, Andrew Bosworth, recently told CNET that gestures would most likely be needed for any future pair of display-enabled glasses.
It’s bad enough on current hardware, but if future AR/XR glasses use that peripheral area for information that doesn’t obstruct your view front-and-center, I’d be left out in the cold.