What is Cognitive Cybernetics?



So what is Cognitive Cybernetics, and how does it help your business trounce the competition?

Cognitive cybernetics is the field of solving problems by simulating cognition and consciousness rather than by simple pattern matching. It adds several elements to common deep learning approaches: ontic centers, simulated hippocampus and thalamus regions, spatial world-map construction and memories, and problem solving via a conscious reason system (CRS). Neural language systems extend traditional NLP processing, providing more intelligent responses in regular English.
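As a purely hypothetical illustration of how such components might fit together - every class and method name below is my own invention, not Noonean's actual API - a hippocampus-style memory module can feed a conscious reason system:

```python
class SpatialWorldMap:
    """Stand-in for hippocampus-style spatial memory: maps places to observations."""
    def __init__(self):
        self.memories = {}

    def remember(self, place, observation):
        self.memories.setdefault(place, []).append(observation)


class ConsciousReasonSystem:
    """Stand-in CRS: answers by reasoning over recalled memories rather
    than by pattern matching alone (the reasoning itself is stubbed out)."""
    def __init__(self, world_map):
        self.world_map = world_map

    def solve(self, place, question):
        evidence = self.world_map.memories.get(place, [])
        return {"question": question, "evidence": evidence}
```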

Cognitive cybernetic solutions are self-organizing, with links extending new connections based on growth computations. Groups respond in a neural-Darwinist fashion to reinforce successes.
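As a rough sketch of what neural-Darwinist selection with growth computations might look like - the group structure, scoring, and growth rule here are illustrative assumptions, not Noonean's actual algorithm:

```python
import random

class NeuralGroup:
    """A competing group of neural units and the links it has grown."""
    def __init__(self, name):
        self.name = name
        self.links = set()    # outgoing connections grown so far
        self.strength = 1.0   # selection weight, adapted by success

def darwinian_step(groups, scores, rate=0.2):
    """One selection cycle: groups scoring above the mean are reinforced
    and grow a new link; the rest decay, ceding resources to winners."""
    mean = sum(scores[g.name] for g in groups) / len(groups)
    for g in groups:
        if scores[g.name] >= mean:
            g.strength *= 1.0 + rate
            g.links.add(f"link-{random.randrange(10**6)}")  # growth step (stub)
        else:
            g.strength *= 1.0 - rate
```

Run over many cycles, successful groups accumulate strength and connections while unsuccessful ones wither, which is the essence of the selectionist dynamic.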

It begins with work done in 1975 by Kunihiko Fukushima. Fukushima's breakthrough, the Cognitron, was unique in that it emphasized dynamic connections. From the paper's abstract:
"The synapse from neuron x to neuron y is reinforced when x fires provided that no neuron in the vicinity of y is firing stronger than y. By introducing this hypothesis, a new algorithm with which a multilayered neural network is effectively organized can be deduced. A self-organizing multilayered neural network, which is named 'cognitron', is constructed following this algorithm, and is simulated on a digital computer. Unlike the organization of usual brain models such as a three-layered perceptron, the self-organization of a cognitron progresses favorably." (Biological Cybernetics, September 1975, Volume 20, Issue 3, pp. 121-136, "Cognitron: A self-organizing multilayered neural network")
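Read as an update rule, that hypothesis is easy to state in code. Here is a minimal sketch of the competitive reinforcement Fukushima describes; the 1-D vicinity window, radius, and learning rate are my own illustrative choices, not his original formulation:

```python
import numpy as np

def cognitron_update(w, pre, post, radius=2, lr=0.1):
    """Reinforce w[x, y] when presynaptic unit x fires, provided no unit
    in the vicinity of y is firing more strongly than y.

    w    : (n_pre, n_post) weight matrix
    pre  : (n_pre,) presynaptic firing rates (0 = silent)
    post : (n_post,) postsynaptic firing rates
    """
    for y in range(len(post)):
        # The "vicinity" of y, modeled here as a window of +/- radius units.
        lo, hi = max(0, y - radius), min(len(post), y + radius + 1)
        # y must fire and be the strongest unit in its neighborhood.
        if post[y] > 0 and post[y] >= post[lo:hi].max():
            # Every firing presynaptic unit reinforces its synapse onto y;
            # silent units (pre == 0) contribute nothing.
            w[:, y] += lr * pre
    return w
```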
Fukushima's work was done with roughly 100 neural units. My undergraduate thesis was done in 1988 on an Apple FX model with 5,000 neural units [Note: The Macintosh IIfx was a model of Apple Macintosh computer, introduced in 1990 and discontinued in 1992. At introduction it cost from US $9,000 to US $12,000, depending on configuration, and it was the fastest Mac available at the time, with an astounding 40 MHz processor and 128 megabytes of RAM.]. With computing power at that time being non-parallel in anything other than costly supercomputers, it is not surprising that this research was dropped in favor of statistical methods. Thus we had the rise of the quants (quantitative analysts) over the cognitive scientists.

In 1978, Nobel Laureate Gerald Edelman published The Mindful Brain and introduced Neural Darwinism, a large-scale theory of brain function. The idea did not catch on until the late 1980s, when it was republished under the title Neural Darwinism, and it revolutionized the world of neural computing in ways that are still only barely being adopted today. Neural Darwinism took cognitrons into a dynamic, evolving space, with neural groups competing for success and adapting. This dynamic adaptation is one of the core foundations of Neural Cube computing.

Then, from 2010 onwards, a shift occurred, driven by computer games. GPU processors with more than 4,000 cores were created, with computing power at the 5-10 teraflop level, all available in a $1,000 computer. As of October 2016, 16 teraflops had been reached with this architecture. By 2019 we had 30 teraflops in the palm of our hands.

This enabled a lot of people to dust off their 1980s research and begin looking again at this technology area. Google and Facebook have primitive models which can do things like image comparison.

This dramatic boost in processing power allows us to achieve a much higher scale: not hundreds or thousands of neural units, but BILLIONS. For example, the NVIDIA Xavier can achieve over 25 trillion operations per second in a device the size of a small laptop.

This change in scale allows us to start working on deep structure: differentiated, rippled neural architectures that support more specific kinds of tasks, much as the thalamus, hippocampus, and Broca's area do in our own brains.

Other techniques, such as limited-field excitation and inhibition, dynamic connection-growth strategies, and layer coordination with deep connections, all produce systems more complex than the simple pattern-recognition solutions being emphasized in the marketplace today. The goal is to provide a generic problem-solving appliance.
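To give one concrete example, limited-field excitation and inhibition can be sketched as a center-surround operation: each unit excites its near neighbors and inhibits a wider surround. Below is a minimal difference-of-Gaussians version; the kernel widths and gains are illustrative assumptions:

```python
import numpy as np

def limited_field(activations, sigma_e=1.0, sigma_i=3.0, gain_e=1.0, gain_i=0.6):
    """Apply center-surround lateral interaction to a 1-D activation vector:
    narrow Gaussian excitation minus wider Gaussian inhibition, rectified."""
    idx = np.arange(len(activations))
    d2 = (idx[:, None] - idx[None, :]) ** 2      # pairwise squared distances
    kernel = (gain_e * np.exp(-d2 / (2 * sigma_e ** 2))
              - gain_i * np.exp(-d2 / (2 * sigma_i ** 2)))
    return np.maximum(0.0, kernel @ activations)
```

The effect is that locally strong activity sharpens into distinct peaks while diffuse activity is suppressed, one of the ways a layer can do more than flat pattern matching.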

All of this is very different - more three-dimensional and alive - than the matrix and tensor math models now popularized by Google. It is several steps more advanced than today's CNNs, or Convolutional Neural Networks, which were designed initially for image recognition and comparison, not language. Our design is principally for language and understanding, and that is what enables us to produce systems like our EnterpriseNLP regular-language search system.

Noonean Cybernetics will leverage these new research principles to produce technologies that help companies manage information overload and find and organize critical knowledge. That knowledge edge is essential for companies to be both AGILE and INNOVATIVE in their own markets.

We welcome you to join us on this fantastic journey.

- Gianna Giavelli, founder of Noonean Cybernetics