In 2021, a British/Polish firm known as Walletmor announced that it had become the first company to sell implantable payment microchips to everyday consumers.
While the first microchip was implanted into a human back in 1998, says BBC News—so long ago it might as well be the Dark Ages in the world of computing—it is only recently that the technology has become commercially available (Latham 2022). People are voluntarily having these chips—technically known as radio-frequency identification (RFID) chips—injected under their skin, because these tiny slivers of silicon allow them to pay for purchases at a brick-and-mortar store just by hovering a hand over a scanner at the checkout counter, skipping credit cards, debit cards, and phone apps entirely.
While many people may initially recoil from the idea of having a microchip inserted into their body, a 2021 survey of more than 4,000 people in Europe found that 51 percent of respondents said they would consider this latest form of contactless payment for everything from buying a subway card to replacing the key fob that unlocks a car door (Marqeta/Consult Hyperion 2021).
In some ways, the use of RFID chips in this manner is merely an extension of what has gone before; the chips are already widely used by pet owners to identify a lost pet. The chips come in many sizes and versions and are far more common than most consumers realize—they are sometimes sewn into articles of clothing so that retailers can monitor the buying habits of their customers long after a purchase has been made. And Apple has now come out with its button-sized tracking devices, which it dubs "AirTags": clip one onto your keys, and the AirTag will help you find where you accidentally dropped them—while also making it simple to track a person, as the Washington Post reported in "Apple's AirTag trackers made it frighteningly easy to 'stalk' me in a test" (Fowler 2021). All for less than $30 per AirTag.
So, to some extent, human-machine products and the use of RFID chips are old hat; the underlying driver has always been the goal of expanding human abilities and powers by making certain tasks easier and less time-consuming.
Consequently, such consumer technology can look like the next logical step—especially among those who already favor piercings and tattoos. But on second glance, the insertion of identifying microchips in humans would also seem to bear the seeds of a particularly intrusive form of surveillance, especially at a time when authorities in some parts of the world have been forcibly collecting DNA and other biological data—including blood samples, fingerprints, voice recordings, iris scans, and other unique identifiers—from all their citizens, in an extreme form of the surveillance state. Before deciding what to think of the tech, we ought to look under the hood, and find out more about some of the nuts and bolts of this hybrid human-machine technology.
Ahmed Banafa is an expert in emerging technologies who has appeared on ABC, NBC, CBS, and Fox television and radio stations. He has served as a professor, academic advisor, and coordinator at well-known American universities and colleges. His research has been featured in Forbes, MIT Technology Review, Computerworld, and Techonomy, and he has published more than 100 articles on the internet of things, blockchain, artificial intelligence, cloud computing, and big data. His research papers have been cited in many patents, numerous theses, and conference proceedings, and he is a frequent guest speaker at international technology conferences. He is the recipient of several awards, including a Distinguished Tenured Staff Award, an Instructor of the Year award, and a Certificate of Honor from the City and County of San Francisco. Banafa studied cybersecurity at Harvard University and is the author of Secure and Smart Internet of Things Using Blockchain and AI.