Simeon Bamford

Neuromorphic Engineer

I find neural systems fascinating, and I love to engineer them. I have decades of experience with AI and its applications.

My work has ranged from machine learning and computational neuroscience, through neuromorphic sensors and processors, to prosthetic systems and wet biology.

In particular, I've developed lots of know-how as a neuromorphic engineer, creating electronic circuits which mimic computation in nervous systems. These circuits, whether integrated on silicon chips or thin flexible sheets, often use electrical currents to imitate currents in our nerve cells and brains. I'm therefore skilled at mixed-signal (analogue and asynchronous digital) chip design and test (VLSI - CMOS, CIS), as well as PCBs, FPGA logic and software, from algorithms to applications.

As an entrepreneur and technologist, I've been involved in the commercialisation of neuromorphic vision sensors, also known as event cameras or dynamic vision sensors. This has taken me into dozens of commercial R&D departments and I've developed an unparalleled insight into the current attempts of many major companies to adopt neuromorphic technology. I've been mixing 3D geometrical algorithms and deep learning models to achieve perception from event cameras.

I've also worked on cloud deployment (AWS) of deep learning models for natural language processing; I've learnt TensorFlow, Lambda serverless compute and the Gremlin graph query language. This high-level focus on practical applications of neural networks has complemented my understanding of their low-level properties.

Most recently, I've been exploring circuits for event-based tactile sensors, and I'm fascinated by the possibilities that flexible electronics (transistors printed on thin flexible sheets) may have in combination with neuromorphic design.

I see huge potential in neural computation, AI and robotic technology to assist us and improve our lives. I'm also very curious to understand more about how our nervous systems allow us to function, and how our mental lives arise.

Projects (past and present)

Key publications

(see also Google Scholar)

Event-based tactile sensors and flexible electronics (2020-present)

I've been exploring design possibilities at the junction between neuromorphic engineering and flexible printed electronics. I've been designing for IGZO transistors and OTFTs.

Event-based visual perception (2019-2021)

At the Italian Institute of Technology (IIT) I continued my work with event-based sensors. I looked at how to fuse 3D geometrical algorithms with deep learning for perception, bringing together my in-depth knowledge of dynamic vision sensors and spike-based computation with my experience of large-scale deep learning models.

Publications

Deep learning for natural language processing (2018-2019)

I worked with a financial services provider to develop and train models for natural language processing (NLP) and deploy them in a cloud architecture.

I worked with a number of technologies, including spaCy for NLP, TensorFlow for machine learning and Gremlin for graph databases, as well as the AWS stack, including Lambda, EC2, S3, CloudFormation, CodePipeline, etc.
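
As a rough illustration of that architecture, here's a minimal sketch of an AWS Lambda handler serving a TensorFlow text classifier; the model path, label names and request format are hypothetical, not details of the actual project.

    # Minimal sketch, not the production code: a Lambda handler serving a
    # TensorFlow text classifier. Paths, labels and request shape are assumed.
    import json
    import tensorflow as tf

    # Load the model once per container, outside the handler, so that warm
    # invocations reuse it instead of reloading it on every request.
    model = tf.keras.models.load_model("/opt/ml/text_classifier")  # hypothetical path
    LABELS = ["negative", "neutral", "positive"]                   # hypothetical labels

    def handler(event, context):
        # With an API Gateway proxy integration, the request body arrives as a JSON string.
        body = json.loads(event.get("body", "{}"))
        text = body.get("text", "")
        # Assumes the saved model includes its own text-vectorisation layer,
        # so it accepts raw strings as input.
        probs = model.predict(tf.constant([text]))[0]
        return {
            "statusCode": 200,
            "body": json.dumps({"label": LABELS[int(probs.argmax())],
                                "scores": probs.tolist()}),
        }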

Deep Learning with TensorFlow 2 and Keras, A. Gulli et al.

I was a technical proofreader for "Deep Learning with TensorFlow 2 and Keras".

Dynamic vision sensors and other neuromorphic hardware (2013-2017)

Dynamic vision sensors (DVS) see the world not as a series of frames, but as a set of changes occurring at individual pixels. Information about change is transmitted asynchronously and with very low latency, and because only changes are transmitted, data rates can be very low in some applications.
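
To make that concrete, here's a toy software model of a generic DVS pixel (not any particular sensor's circuit): each pixel emits an ON or OFF event whenever its log intensity has changed by more than a threshold since its last event.

    import numpy as np

    def dvs_events(frames, threshold=0.15, eps=1e-6):
        """Toy DVS emulation on a sequence of greyscale frames: compare each
        pixel's log intensity against its level at the time of its last event
        and emit (t, y, x, polarity) events on threshold crossings."""
        ref = np.log(frames[0] + eps)              # per-pixel reference level
        events = []
        for t, frame in enumerate(frames[1:], start=1):
            logf = np.log(frame + eps)
            diff = logf - ref
            ys, xs = np.where(np.abs(diff) >= threshold)
            for y, x in zip(ys, xs):
                events.append((t, int(y), int(x), 1 if diff[y, x] > 0 else -1))
                ref[y, x] = logf[y, x]             # reset reference at this pixel
        return events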

In my role at iniLabs GmbH and then iniVation AG, I worked on the design of third-generation DVS as part of the EU's SEEBETTER project, and as part of DARPA's SyNAPSE project, gaining deep insight into IBM's TrueNorth technology. Other neuromorphic hardware arising from the Institute of Neuroinformatics (INI) includes the Dynamic Audio Sensor of Dr. Shih-Chii Liu and students, and the Dynamic Neuromorphic Asynchronous Processor (DYNAP-SE) of Prof. Giacomo Indiveri and students.

I oversaw the marketing, sales and support of these prototypes into hundreds of organisations, including both academic and commercial R&D departments, and I worked with several organisations to develop algorithms and applications for these technologies. I've therefore gained an unparalleled insight into the attempts of many major companies to adopt neuromorphic technology.

Here are a couple of videos from iniLabs, which I created with a colleague, showing some of the output of these sensors:


This article from IMVE magazine Dec 2017 gives a good overview of the state of play by the end of my involvement.

Here's Dharmendra Modha's IBM TrueNorth team photo:

Publications

Closed-loop brain prosthesis (2009-2011)

The aim of the ReNaChip project was to create a chip that could be implanted in the brain to replace one of its functions in performing a learning task. The chip I designed takes neural signals amplified from electrodes, processes them to detect events, and then implements a model of cerebellar classical conditioning. It does this by means of a field-programmable array of mixed-signal components specialised for neural signal processing and neural modelling. My interest in field-programmable circuitry was enhanced by a brief contract at Edinburgh University on a project to create a related design specialised for neuromorphic applications. The ReNaChip was used to rehabilitate eyeblink conditioning in an anaesthetised rat.
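
As a purely illustrative caricature of those two stages (a software sketch, not the mixed-signal design itself), the processing chain can be thought of as threshold-based event detection followed by a simple trace-based conditioning rule:

    import numpy as np

    def detect_events(signal, threshold):
        """Upward threshold crossings in an amplified neural signal."""
        above = signal > threshold
        return np.flatnonzero(above[1:] & ~above[:-1]) + 1

    class ConditioningModel:
        """Toy classical conditioning: a decaying trace of the conditioned
        stimulus (CS) is associated with the unconditioned stimulus (US),
        so that with training the CS alone comes to drive a response.
        Parameters are illustrative, not those of the ReNaChip model."""
        def __init__(self, lr=0.05, decay=0.95, response_threshold=0.5):
            self.w = 0.0          # learned CS -> response association
            self.trace = 0.0      # eligibility trace of recent CS events
            self.lr, self.decay = lr, decay
            self.response_threshold = response_threshold

        def step(self, cs, us):
            self.trace = self.decay * self.trace + cs
            if us:                                   # learn when the US arrives
                self.w += self.lr * self.trace
            return self.w * self.trace > self.response_threshold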

Publications

Neuromorphic synaptic rewiring for topographic map development, including work on STDP (2006-2009)

For my PhD at the University of Edinburgh I worked on an alternative method for delivering events within neuromorphic systems made of many silicon chips. The events represent spikes (the electrical pulses that brain cells use to communicate with each other). These events were broadcast across each chip and could be received simultaneously by many synapses (the connections between neurons) - this reduced a speed bottleneck present in other designs.
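
A minimal software analogy of that broadcast scheme (an assumption-laden sketch, not the chip itself): each address-event is presented to every synapse at once, and each synapse decides locally whether it is connected to that presynaptic address.

    from collections import defaultdict

    class BroadcastChip:
        """Toy address-event model: every incoming spike is broadcast to all
        synapses, each of which matches on the presynaptic address it stores."""
        def __init__(self):
            # postsynaptic neuron id -> list of (presynaptic address, weight)
            self.synapses = defaultdict(list)

        def connect(self, pre, post, weight):
            self.synapses[post].append((pre, weight))

        def deliver(self, pre_address):
            """Broadcast one presynaptic spike; return input current per neuron."""
            currents = defaultdict(float)
            for post, syns in self.synapses.items():
                for pre, weight in syns:
                    if pre == pre_address:   # this synapse recognises the address
                        currents[post] += weight
            return currents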

I also implemented the formation and elimination of synapses (a process which happens continuously in our brains, known as "synaptic rewiring" or "structural plasticity"). I then used synaptic rewiring to model the development of topographic maps (ordered sets of connections between different brain areas).
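
One way to picture structural plasticity in software (a loose sketch under assumed probabilities, not the exact rule from my thesis) is a loop in which weak synapses are pruned and the freed axonal contacts may form new synapses at random sites:

    import random

    def rewire_step(connections, n_post=64, form_prob=0.1, elim_prob=0.05):
        """One step of toy structural plasticity. `connections` maps
        (pre, post) -> weight in [0, 1]; weaker synapses are more likely to
        be eliminated, and each eliminated contact may re-form elsewhere."""
        freed = []
        for (pre, post), w in list(connections.items()):
            if random.random() < elim_prob * (1.0 - w):    # prune weak synapses
                del connections[(pre, post)]
                freed.append(pre)
        for pre in freed:
            if random.random() < form_prob:                # form a new contact
                post = random.randrange(n_post)            # random target site
                connections.setdefault((pre, post), 0.5)   # nascent synapse weight
        return connections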

As a learning rule, the model used a type of spike-timing-dependent plasticity (STDP). I created an analogue silicon circuit to emulate this learning rule, which allowed some control of the weight-dependence of the plasticity. A serendipitous result was that the learning rule automatically provided some compensation for inhomogeneities in the chip design.
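
For readers unfamiliar with STDP, here's a minimal pair-based rule with adjustable weight-dependence, of the general kind the circuit emulated; the parameter values and exact form are illustrative, not those of the silicon implementation.

    import math

    def stdp_update(w, dt, a_plus=0.01, a_minus=0.012, tau=20.0,
                    w_max=1.0, mu=0.5):
        """Pair-based STDP with tunable weight-dependence. dt = t_post - t_pre
        in ms; mu = 0 gives additive (weight-independent) updates, mu = 1 gives
        fully multiplicative (weight-dependent) updates."""
        if dt > 0:    # pre before post: potentiation
            dw = a_plus * ((w_max - w) ** mu) * math.exp(-dt / tau)
        else:         # post before pre (or simultaneous): depression
            dw = -a_minus * (w ** mu) * math.exp(dt / tau)
        return min(max(w + dw, 0.0), w_max)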

Publications

Planar patch clamp (2005)

During my MSc I worked on a project testing an experimental device (a planar patch-clamp chip) for electrical recording from biological nerve cells. This project gave me experience with the patch-clamp technique as well as some silicon clean-room experience.

Contact

simbamford@gmail.com

LinkedIn