Nick Brestoff was a California attorney (1975-2014) with an engineering education (B.S., UCLA; M.S., Caltech) before law school at USC, where he was a member of the Law Review. He is the sole or lead co-inventor of seven patents that use deep learning in the claims, including the first issued patent that uses both deep learning and blockchain in the claims; all were issued in 2017-18 and assigned to the company he founded, Intraspexion LLC.
I’m fascinated by emerging technologies. I searched “emerging technologies” on Wikipedia and found a main article, “List of emerging technologies,” along with a related set of examples: Artificial Intelligence (“AI”); 3D printing; cancer vaccines; cultured meat; nanotechnology; robotics; stem-cell therapy; distributed ledger technology (i.e., blockchain); and medical field advancements. I’ve already written about six categories of emerging technologies: AI in the form of deep learning on September 30, 2021; blockchain on November 9, 2021; quantum computing on November 20, 2021; and then three more categories: stem cells, “robot,” and edge computing on December 11, 2021. For this article, I decided to investigate three additional categories: 3D printing (“Additive Manufacturing”), genetic therapy (“gene therapy”), and nanotechnology (“Nano”), and to compute a bar chart for all nine categories.
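The bar chart mentioned above can be sketched in a few lines of Python. The counts below are placeholders, not the article’s actual patent data, and the category labels are only approximations of the author’s search terms; the point is just the compute-and-rank step.

```python
# Hypothetical per-category patent counts (placeholders, NOT the
# article's actual data), rendered as a simple text bar chart.
counts = {
    "Deep Learning": 9000,
    "Blockchain": 4000,
    "Quantum Computing": 1500,
    "Stem Cells": 3000,
    "Robot": 5000,
    "Edge Computing": 800,
    "3D Printing": 2500,
    "Gene Therapy": 2000,
    "Nanotechnology": 3500,
}

def text_bar_chart(data, width=40):
    """Return one line per category, largest first, bars scaled to `width`."""
    peak = max(data.values())
    lines = []
    for name, n in sorted(data.items(), key=lambda kv: -kv[1]):
        bar = "#" * round(width * n / peak)
        lines.append(f"{name:>17} | {bar} {n}")
    return lines

for line in text_bar_chart(counts):
    print(line)
```

A plotting library would produce the same ranking graphically; the sorting step is what makes the “same story” in each chart easy to see.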
Previously, I’ve written about patent trends for the emerging technologies of deep learning, blockchain, and quantum computing. This article shifts the focus to the realms of biology (“stem cells”), machinery (“robot”), and then back to computing, and specifically to a topic suggested by a reader (with the handle “Primary Examiner”), “edge computing.” In each instance, the bar chart tells the same story.
Towards the end of 2019, I was finishing a book, AI Concepts for Business Applications. The last chapter was titled “The Future.” I wrote about quantum computing and a related version of deep learning: a “quantum walk neural network.” In 1980, the idea of a quantum processing unit was proposed. Such a processing unit doesn’t use the 1s and 0s with which we’re familiar. That “classical” way of thinking is the way we think, with a 1 for true and a 0 for false, and combinations, for example, a “false positive.” Quantum computing is instead based on “quantum bits,” or “qubits” for short, which can exist in a “superposition” of states. But there’s a big difference between the way we think and the way nature behaves. In 1981, the late Caltech professor Richard Feynman (a Nobel Prize co-winner for his work on “quantum electrodynamics”) summed it up: “Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.” Now, quantum computing is beginning to emerge.
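The “superposition” idea can be illustrated with a short classical simulation. This is only a sketch: a single qubit is represented by two complex amplitudes, and measuring it repeatedly yields 0 or 1 with probabilities given by the squared magnitudes of those amplitudes. Real quantum hardware, of course, is not simulated this way.

```python
import random

def measure(alpha, beta, trials=10_000, seed=0):
    """Simulate repeated measurements of the state alpha|0> + beta|1>.

    Returns the observed frequency of outcome 0, which should
    approach |alpha|^2 as the number of trials grows.
    """
    p0 = abs(alpha) ** 2  # probability of observing 0
    rng = random.Random(seed)
    zeros = sum(1 for _ in range(trials) if rng.random() < p0)
    return zeros / trials

# An equal superposition: alpha = beta = 1/sqrt(2), so each
# measurement gives 0 or 1 with probability 1/2.
amp = 2 ** -0.5
print(f"observed frequency of 0: {measure(amp, amp):.3f}")
```

Unlike a classical bit, the qubit holds both amplitudes at once; only measurement forces a definite 0 or 1.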
Blockchain’s history begins in 1991, when Stuart Haber and W. Scott Stornetta published a paper describing a cryptographically secured chain of blocks. It took another 18 years before a developer who called himself Satoshi Nakamoto released a white paper that established the model for a blockchain and then, a year later, implemented the first blockchain as a public ledger for transactions using bitcoin. The engine that runs the bitcoin ledger that Nakamoto designed is called the blockchain; the original and largest blockchain is the one that still orchestrates bitcoin transactions today. Blockchain technology was separated from currency in 2014, and that advance opened the door for using blockchain for applications beyond currency. The standout example is the Ethereum blockchain system, which introduced computer programs in a blockchain format, representing financial instruments such as bonds. These became known as smart contracts.
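The “cryptographically secured chain of blocks” that Haber and Stornetta described can be sketched in miniature. In this toy version (the block fields and transaction strings are invented for illustration), each block stores the hash of its predecessor, so altering any earlier block breaks every later link.

```python
import hashlib
import json

def block_hash(block):
    """SHA-256 over a canonical JSON encoding of the block."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, data):
    """Append a block that records the hash of the current last block."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "prev_hash": prev, "data": data})

def is_valid(chain):
    """Check that every block's prev_hash matches its predecessor's hash."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
add_block(chain, "genesis")
add_block(chain, "Alice pays Bob 1 coin")
add_block(chain, "Bob pays Carol 1 coin")
print(is_valid(chain))   # True
chain[1]["data"] = "Alice pays Mallory 100 coins"  # tamper with history
print(is_valid(chain))   # False: the link from block 1 to block 2 breaks
```

A real blockchain adds proof-of-work, digital signatures, and a peer-to-peer consensus protocol on top of this hash-linking idea, but the tamper-evidence shown here is the core of it.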
In 2015, I spotted what I thought might be an emerging technology: deep learning. Because of my engineering education, I was able to climb the “deep learning” curve. The term “deep learning” is the current name for a “deep neural network,” which was previously called a “multi-layer neural network.” While our organic brains contain approximately 86 billion neurons, the “deep learning” quest was built on mathematics and Graphics Processing Units (GPUs). It seemed like a breakthrough. Given enough examples of a category, could a deep learning model assess data that the model hadn’t seen previously, and then score, rank, and report the “matches” to a user? In short order, I was persuaded that the answer was yes.
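The score-rank-report step described above can be sketched without any particular model. Assume a trained classifier has already assigned each unseen document a probability-like score for the target category; the document names and scores below are made up for illustration.

```python
# Hypothetical scores from a trained classifier for documents the
# model has never seen; the names and values are invented.
docs = {
    "doc_001": 0.91,
    "doc_002": 0.12,
    "doc_003": 0.78,
    "doc_004": 0.49,
    "doc_005": 0.86,
}

def report_matches(scores, threshold=0.5):
    """Rank documents by score and return those at or above the threshold."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [(doc, s) for doc, s in ranked if s >= threshold]

for doc, score in report_matches(docs):
    print(f"{doc}: {score:.2f}")
```

The deep learning model’s job is producing the scores; everything after that is ordinary sorting and filtering, which is what makes the “matches” easy to surface to a user.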