The Knowm team attended this year’s NICE workshop at IBM Almaden, sponsored by the Semiconductor Research Corporation, IBM and the Kavli Foundation. The conference offered a good sampling of the modern celebrities of neuromorphics and was representative of the entrenched government, corporate and academic interests in the space, who are clearly starting to feel the pressure from the machine learning community.

Knowm Team NICE 2017

The conference started with introductions from conference chair Murat Okandan and IBM Almaden director Spike Narayan. Murat recalled H.G. Wells’ World Brain and asked the question: What are we going to use these tools for? Mr. Narayan was more pragmatic, discussing a direction toward computing at the edge and helping humans do their jobs better, and trying to draw a distinction between “Machine Learning” and “Machine Intelligence”. We found the ML/MI distinction arbitrary and indicative of IBM technology name-branding (à la “cognitive computing”).
NICE Highlights:

  1. True to DARPA’s continual drum-beat after the launch of the SyNAPSE program, another program (Lifelong Learning) is coming that aims to resolve the catastrophic forgetting problem found in statistical learning methods such as neural networks. Program manager Hava Siegelmann gave an overview of her vision for the program, which involves three areas: (1) software architectures, algorithms, theories, and supervised and unsupervised reward-based learning; (2) physical principles, including learning from nature and transfer to machine learning; and (3) evaluation (by a government team).
  2. Karlheinz Meier presented for the Human Brain Project, representing both the funding agency and himself as a performer (entrenched much?).
  3. Kwabena Boahen gave an engaging presentation with great animations and revealed the neuromorphs’ five-point ‘secret’ master plan:
    1. Implement dendritic computation with sub-threshold analog circuits to degrade gracefully with noise.
    2. Implement axonal communication with asynchronous digital circuits to be robust to transistors that shut off intermittently.
    3. Scale pool-to-pool spike communication linearly with the number of neurons per pool, rather than quadratically.
    4. Distribute computation across a pool of silicon-neurons to be robust to transistors that shut off intermittently or even permanently.
    5. Encode continuous signals in spike trains with precision that scales linearly with the number of neurons.
  4. Sankar Basu gave an overview of NSF’s efforts in the Nanotechnology Initiative, BRAIN Initiative, and National Strategic Computing Initiative (NSCI). Of interest was a map showing the groups selected for the SRC co-sponsored E2CDA program, which was completely devoid of small businesses, America’s technology engine. This is due to SRC’s IP acquisition strategy, which requires all program-generated IP, including all background IP, to be made available to SRC-member companies in perpetuity. This is a good example of government-corporate partnerships that only benefit the big corporations (in this case SRC member companies).
  5. Robin Degraeve, a researcher at IMEC, gave an overview of an interesting RRAM-based processor for unsupervised temporal predictions that attempts to work around the stochastic properties of oxygen-vacancy RRAM.
  6. Sandia National Labs was heavily represented, with six speakers covering Spike Temporal Processing Units, neuromemristive hardware, hippocampus-inspired adaptive algorithms, and game theory.
  7. A joint Stanford and Sandia National Labs effort achieved low-voltage artificial synapses with a three-terminal technology they call ENODE, or more verbosely, poly(ethylenedioxythiophene):poly(styrene sulfonate) modified by polyethyleneimine vapor. The selling point here is that the energy barriers for state retention and state modification are decoupled, achieving low switching energies while maintaining non-volatility. This three-terminal technology has its roots in the ‘memistor’ work by Bernard Widrow in the 1960s.
  8. Jeff Hawkins gave an energetic talk about his new idea that all cortical elements encode allocentric information, while avoiding a detailed description of how this allocentric information comes to be. Hawkins claims:
    1. All sensory modalities learn 3D models of the world.
    2. All knowledge is stored relative to external (allocentric) reference frames.
    3. It is not possible to build intelligent machines without these properties.
Jeff Hawkins NICE 2017

  9. A very tired Ruchir Puri gave an overview of the IBM Watson system, covering four levels: cloud and infrastructure services, content, cognitive compute, and conversation applications.
  10. Narayan Srinivasa (former HRL program manager for the DARPA SyNAPSE, PI and UPSIDE programs) discussed the relationship of energy and information from a neuroscience perspective. Of interest was his review of the work of Levy & Baxter and his discussion of energy-efficient spike codes. The idea is that the optimal spike code changes depending on the ratio of active to fixed energy costs; a toy sketch of this trade-off follows this list.
  11. Wolfgang Maass gave an inspired talk about synaptic genesis and how common models of neural network learning are incomplete or just plain wrong.
  12. Tarek Taha and Miao Hu from HP Labs each presented their work on neuromemristive processors and devices, showing great progress on device development and memristor-pair synapses.
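
To make the Levy & Baxter trade-off concrete, here is a minimal sketch of how the most efficient firing probability of a binary spike code shifts with the cost structure. The cost model (a fixed cost per time bin plus an active cost per spike) and the efficiency measure (entropy per bin divided by energy per bin) are our own simplifying assumptions, not a reproduction of the talk or the original paper.

```python
import numpy as np

# Toy Levy & Baxter-style trade-off (our assumptions, not the talk's exact model):
# each time bin costs a fixed amount F (resting/infrastructure energy) plus A per
# spike, and we score a binary spike/no-spike code by information per unit energy.

def entropy_bits(p):
    """Shannon entropy (bits per bin) of a code that spikes with probability p."""
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def efficiency(p, fixed_over_active):
    """Bits per unit energy, with energy measured in units of the per-spike cost."""
    return entropy_bits(p) / (fixed_over_active + p)

p = np.linspace(1e-4, 0.5, 10_000)
for ratio in (0.01, 0.1, 1.0, 10.0):          # ratio = F/A, fixed-to-active cost
    p_opt = p[np.argmax(efficiency(p, ratio))]
    print(f"F/A = {ratio:5.2f} -> most efficient firing probability ~ {p_opt:.3f}")
```

In this toy model, sparse codes (low firing probability) win when spikes dominate the energy budget, and the optimum drifts toward a 50% firing probability as fixed costs take over, which is the qualitative point of the talk.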
Miao Hu NICE 2017

While Knowm did not present at NICE this year, Alex was invited to speak at Mentor Graphics the very next day. In the following video, Alex discusses various aspects of AHaH Computing, including motivation & history, basic building blocks and mechanics, stem-cell logic and pattern recognition, and the emerging open-source software tools for VLSI chip design and simulation. Alex ends the talk by showing incremental conductance changes with Knowm memristors and the new Knowm Memristor Discovery tool.
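
For readers unfamiliar with the memristor-pair synapses mentioned above, here is a minimal toy sketch of what incremental conductance changes in a differential pair look like. The class name, step size, and bounded-update rule are illustrative assumptions for this post, not Knowm's device model or the behavior measured in Memristor Discovery.

```python
# Toy differential memristor-pair synapse: the effective weight is the difference
# of two conductances, and each pulse nudges one conductance incrementally.
# The bounded-update rule and step size are illustrative assumptions only.

class PairSynapse:
    def __init__(self, g_min=0.0, g_max=1.0, step=0.05):
        self.g_a = self.g_b = 0.5 * (g_min + g_max)
        self.g_max, self.step = g_max, step

    @property
    def weight(self):
        # Differential weight: positive when Ga dominates, negative when Gb does.
        return self.g_a - self.g_b

    def pulse(self, direction):
        """direction=+1 potentiates (raises Ga), direction=-1 depresses (raises Gb)."""
        if direction > 0:
            self.g_a += self.step * (self.g_max - self.g_a)  # smaller steps near the bound
        else:
            self.g_b += self.step * (self.g_max - self.g_b)

syn = PairSynapse()
for _ in range(10):
    syn.pulse(+1)
print(f"weight after 10 potentiating pulses: {syn.weight:.3f}")
```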
