What truly holds the title of the world's most precious resource today? Surprisingly, it's not the usual contenders like oil, computing power, or cryptocurrencies. These are inert assets that gain value only when human attention is applied to them. In fact, attention drives everything from the demand for scarce minerals to the shape of international trade agreements. Capturing attention with a cleverly placed advertisement can set off a domino effect of labor, resource extraction, and even shifts in geopolitical dynamics. Mastery over the physical world becomes secondary when one can harness attention at scale.
The line between our minds and the attention economy is blurring
Today, companies, wealthy individuals, and soon, artificial entities are in a fierce competition to capture and manipulate human attention for their own interests. Just three decades ago, the methods used to harness attention were crude and inefficient—relying on broad and nonspecific channels such as print media, radio, and television. Despite their lack of efficiency, these traditional media succeeded in rallying entire populations, shaping societies to achieve incredible feats, and even enabling the dehumanization necessary for total warfare. A significant breakthrough in attention capture came with the introduction of the iPhone and its surrounding ecosystem—marking the first instance of a cognitive aid accessible to the masses, effectively turning nearly every person into a “tuned-in cyborg,” perpetually connected to a global network.
Fast forward to today, less than 17 years after the launch of the iPhone, and we find ourselves on the brink of another transformative leap, one that could be even more chaotic and revolutionary than we can envision. Picture a future where corporations treat your mind as an extension of their data streams, extracting attention directly from your neural activity for profit while offering you only a façade of control. Imagine a world in which reality is “enhanced” through limitless artificially generated experiences and sensations.
“The sky above the port was the color of television, tuned to a dead channel.”
William Gibson’s 1984 novel, Neuromancer, provides an eerily accurate preview of where our society seems to be heading. Privacy is nearly extinct, and vast corporations hoard data as a tradeable commodity. Reality morphs into a shared “consensual hallucination” experienced daily by billions connected to cyberspace, a space dominated by corporations and built on a patchwork of often-compromised infrastructure. The recent launch of technologies such as the Apple (AAPL) Vision Pro and Orion AR, seemingly intended for entertainment, edges us closer to this unsettling reality. These devices collapse the distance between our thoughts and their immediate effect on the digital environments we engage with.
To garner attention, look to the eyes
The Apple Vision Pro builds on the groundwork laid by Google (GOOGL) Glass, establishing a feedback loop that responds to our attention by tracking eye movements. Its array of internal sensors can infer arousal, cognitive load, and general emotional state from subtle variations in pupil size and rapid eye movements. Pupil size, for instance, is an indicator of noradrenergic tone: it reflects the state of the sympathetic nervous system and is regulated by output from the locus coeruleus, a brain structure linked to arousal and focus. While the current capabilities of this technology may seem limited, it is notable that Apple has eliminated the need for external controllers by using something as instinctive as a user’s gaze to navigate digital environments.
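For a rough sense of the kind of signal these sensors expose, here is a minimal Python sketch, not Apple's actual pipeline, that turns a stream of pupil-diameter samples into a crude arousal index by comparing the newest sample against a rolling baseline. The window size, sampling rate, and sample values are illustrative assumptions only.

```python
import statistics

def arousal_index(pupil_mm, baseline_window=120):
    """Crude arousal proxy: z-score of the newest pupil-diameter sample
    against a rolling baseline of the preceding samples."""
    baseline = pupil_mm[-(baseline_window + 1):-1]   # everything but the newest sample
    mean = statistics.fmean(baseline)
    stdev = statistics.pstdev(baseline) or 1e-6      # guard against a perfectly flat baseline
    return (pupil_mm[-1] - mean) / stdev

# Simulated stream: a steady ~3.2 mm pupil, then a sudden dilation (an arousal spike).
stream = [3.2 + 0.05 * (i % 3) for i in range(120)] + [4.1]
print(f"arousal z-score: {arousal_index(stream):.1f}")   # large positive value
```

A real headset would combine a signal like this with saccade frequency, blink rate, and gaze targets, which is precisely why the raw stream is so revealing.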
Society is being conditioned, along with those doing the conditioning
Despite the wonder of modern technology, it’s crucial to understand that these advancements come with strings attached—consequences we can’t overlook. Devices like the Vision Pro are subtly preparing society for a future where more intrusive technologies—such as Neuralink or other brain-computer interfaces—could undermine human agency entirely. Today’s economic landscape does not prioritize privacy, individual rights, or digital autonomy. Why? Because greater profits are generated when human behavior is predictably organized into profitable segments: think of sectors like sexuality, status, and safety.
These areas thrive on cultural memes and the directed attention of organized collectives rather than on independent thought and self-governance. If this bleak scenario unfolds, any technology that links individual choices to open information systems will disproportionately benefit corporations, governments, and, increasingly, artificial agents. The true winners will not be the readers, the authors, or even the majority of users of these technologies; the biggest beneficiaries will likely be artificial agents operating behind the scenes, optimizing for objectives that may not align with human interests. This is the path we are currently on unless we take decisive action.
Establishing a biometric privacy framework is essential
There is a general consensus on the necessity of privacy. However, despite frameworks such as GDPR, CCPA, and pioneering initiatives like Chile’s Neuro Rights Bill, the same issues persist—and the risks continue to escalate. While regulations and policies are being developed, they lack effective implementation.
What we truly need is to embed digital natural rights directly into the foundational infrastructure of the internet and connected devices. This begins with making it straightforward for individuals to establish and maintain self-custody of their cryptographic keys: the keys that secure communications, authenticate identities, and safeguard personal data without reliance on corporations, governments, or other third parties.
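As a concrete illustration of what self-custody can look like, here is a minimal Python sketch using the open-source cryptography library: a key pair is generated and used entirely on the user's own device, with no corporation or third party in the loop. The message and passphrase are placeholders, and a real deployment would keep the private key in hardware-backed storage.

```python
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Generate a key pair locally; no external service ever sees the private key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Sign a statement to authenticate it without a central authority.
message = b"I consent to share my calibration data with this application."
signature = private_key.sign(message)

# Anyone holding the public key can check the signature;
# verify() raises InvalidSignature if the message or signature was tampered with.
public_key.verify(signature, message)

# Serialize the private key, encrypted under a user-chosen passphrase,
# so it can live in local (ideally hardware-backed) storage.
encrypted_pem = private_key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.BestAvailableEncryption(b"user-passphrase"),
)
print(encrypted_pem.decode())
```

The point is structural: nothing in this flow requires an account, a server, or anyone's permission.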
Holonym’s Human Keys present a potential solution. By enabling individuals to create cryptographic keys securely and privately, we can shield sensitive data while preserving privacy and autonomy. Human Keys are particularly valuable because creating and using them does not require trusting any single corporation, individual, agent, or government.
Homomorphic encryption allows computations to run directly on encrypted data, so sensitive signals never need to be exposed in the clear to the party processing them. When we pair such techniques with devices like the Apple Vision Pro or Neuralink, we have the opportunity to enhance cognitive capabilities without sacrificing the privacy of sensitive user data.
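To make that concrete, the toy sketch below implements a miniature Paillier cryptosystem, a classic additively homomorphic scheme: a reading is encrypted on the device, a remote service adds an encrypted adjustment without decrypting anything, and only the key holder can recover the result. The hard-coded primes and the "pupil reading" are deliberately tiny, purely illustrative values; a real system would use an audited library and far larger parameters. (It assumes Python 3.8+ for the modular-inverse form of pow.)

```python
import secrets
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

def generate_keys():
    # Tiny hard-coded primes keep the example readable; never do this in practice.
    p, q = 293, 433
    n = p * q
    lam = lcm(p - 1, q - 1)
    g = n + 1                      # standard simplification for Paillier
    mu = pow(lam, -1, n)           # modular inverse; valid because g = n + 1
    return (n, g), (lam, mu, n)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    r = secrets.randbelow(n - 1) + 1
    while gcd(r, n) != 1:          # r must be invertible mod n
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    L = (pow(c, lam, n * n) - 1) // n
    return (L * mu) % n

pub, priv = generate_keys()

# The headset encrypts a raw biometric reading before it ever leaves the device...
pupil_reading = 42
enc_reading = encrypt(pub, pupil_reading)

# ...and a remote service adds its own encrypted adjustment without decrypting either value.
enc_adjustment = encrypt(pub, 7)
enc_sum = (enc_reading * enc_adjustment) % (pub[0] ** 2)   # ciphertext product = plaintext sum

assert decrypt(priv, enc_sum) == 49   # the service never saw 42 or 7 in the clear
```

The same principle, at production scale, is what would let a cloud service personalize an experience from encrypted biometric or neural data it can never read.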
Nevertheless, it’s critical to recognize that software alone is insufficient. We also require secure hardware that complies with publicly verifiable and open standards. Governments must play a pivotal role in ensuring that manufacturers adhere to stringent security protocols when developing devices that handle and store cryptographic keys. Just as clean water and breathable air are considered public goods, secure hardware for key storage should also be regarded as a public good, with governments responsible for its safety and availability.
As we look toward the future of ethical neurotechnology, we must be cautious and heed the warnings of visionaries like Gibson, who alert us to the dangers of technology encroaching on our privacy, autonomy, and humanity. Brain-computer interfaces (BCIs) hold the promise of expanding human potential, but only if they are guided by ethical standards. By weaving biometric privacy into the fabric of our digital systems through self-custodial keys and homomorphic encryption, and by advocating for open hardware standards, we can ensure these technologies empower rather than exploit individuals.
The future does not have to be a dystopian nightmare. Instead, it can be a landscape where innovation enriches humanity, upholding our rights while creating new opportunities. This vision is not merely hopeful; it is essential for sculpting a future where technology serves us, rather than dictates our lives.