Apple has done the thing it does — waited for everyone else to get it slightly wrong, then walked out and done it properly. At a special event Tuesday at its Cupertino campus, the company unveiled NeuralType, a slim over-ear device that reads electrical signals from the brain's language centres and converts them to text with what Apple claims is 98.3% accuracy.

The demonstration was striking. CEO Jordan Mercer stood on stage, hands at her sides, eyes closed, and narrated a 200-word passage from a novel entirely in her head. The words appeared on the screen behind her in real time, punctuation and all. The audience, which included journalists who cover Apple for a living and are professionally resistant to being impressed, fell silent.

"We didn't want to put anything in your head. So we didn't."— Jordan Mercer, Apple CEO

Unlike the surgically implanted devices offered by rival neural interface companies, NeuralType is entirely non-invasive. It is worn like a pair of headphones and powered by a custom silicon chip Apple is calling the N1. The device works by detecting and interpreting the faint electrical signals the brain generates when constructing language, even before the speech muscles are engaged.
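Apple has published no developer API for NeuralType, so any code is speculation. But for readers who want a concrete picture of the pipeline Mercer described (sensors feeding an on-chip decoder that streams text) here is a minimal Swift sketch. Every type, name, and number below is invented for illustration; only the 98.3% figure comes from Apple's claim.

```swift
import Foundation

// Hypothetical sketch only: Apple has not published a NeuralType SDK.
// All types below are invented to illustrate the described pipeline:
// electrode samples -> on-chip decoding -> streamed text.

/// A buffered window of raw electrode samples, as a non-invasive
/// headset might deliver them (channel count and units are assumptions).
struct SensorFrame {
    let channels: [[Double]]   // one voltage series per electrode
    let sampleRateHz: Double
}

/// One decoded fragment of inner speech, with the model's confidence.
struct DecodedToken {
    let text: String
    let confidence: Double     // 0.0 ... 1.0
}

/// Stand-in for the on-device model said to run on the N1 chip.
/// A real decoder would map neural features to text; this stub
/// just returns a fixed placeholder token.
protocol LanguageDecoder {
    func decode(_ frame: SensorFrame) -> [DecodedToken]
}

struct StubDecoder: LanguageDecoder {
    func decode(_ frame: SensorFrame) -> [DecodedToken] {
        guard !frame.channels.isEmpty else { return [] }
        return [DecodedToken(text: "hello", confidence: 0.983)]
    }
}

/// Drops low-confidence tokens before they reach the screen,
/// the kind of threshold a shipping device would need to hit
/// its claimed accuracy.
func transcribe(frames: [SensorFrame],
                using decoder: LanguageDecoder,
                minimumConfidence: Double = 0.9) -> String {
    frames
        .flatMap(decoder.decode)
        .filter { $0.confidence >= minimumConfidence }
        .map(\.text)
        .joined(separator: " ")
}

// Example: one synthetic 8-channel frame through the stub decoder.
let frame = SensorFrame(
    channels: Array(repeating: [0.12, -0.08, 0.05], count: 8),
    sampleRateHz: 1_000
)
print(transcribe(frames: [frame], using: StubDecoder()))  // "hello"
```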

Privacy researchers immediately raised concerns about a device that, in effect, reads the language of thought. Apple was prepared: all neural processing happens on-device, the company said, with no data leaving the chip, and the headset includes a hardware kill switch, a physical button that disables the sensor array entirely. "We didn't design a window into your mind," Mercer said. "We designed a keyboard you don't have to touch."
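The kill switch is a hardware guarantee, not a software one, but the shape of the privacy model can still be sketched in code. The Swift below, again entirely hypothetical and using invented names, shows the two properties Apple described: decoding stops when the switch is engaged, and the raw samples are scoped so that no API exists to export them.

```swift
import Foundation

// Hypothetical sketch of the privacy model as described on stage.
// None of these names come from Apple; they illustrate the claims.

enum SensorState {
    case active
    case killed   // physical switch engaged; sensor array powered off
}

struct NeuralSession {
    private(set) var state: SensorState = .active

    /// The kill switch is physical, so software can only observe it,
    /// never override it.
    mutating func observeKillSwitch(engaged: Bool) {
        state = engaged ? .killed : .active
    }

    /// Decoding returns nothing once the switch is engaged. Because
    /// the raw samples never leave this scope, no caller can
    /// serialize them off the device.
    func decodeNextToken(from samples: [Double]) -> String? {
        guard state == .active, !samples.isEmpty else { return nil }
        return "word"  // stand-in for on-chip model output
    }
}

var session = NeuralSession()
print(session.decodeNextToken(from: [0.1, 0.2]) ?? "<no output>")  // "word"
session.observeKillSwitch(engaged: true)
print(session.decodeNextToken(from: [0.1, 0.2]) ?? "<no output>")  // "<no output>"
```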

NeuralType will launch in beta for accessibility users in September, priced at £349, with a full consumer release expected in early 2027. Regulators in the EU and UK have already indicated they will be watching closely.