Artificial Neural Networks (ANN) was one of the electives I took during my undergraduate studies. Back then I couldn’t understand the importance and scope of ANN, but recent advances in technology, in both hardware and software, have made me realize its importance. Any technological breakthrough requires matching innovation in the software and hardware domains, and ANN has found its way into both.
ANN In Hardware:
Software requires an excellent architecture to run on, and IBM Research recently announced a brain-inspired processor. IBM claims that the chip consumes merely 70 milliwatts and is capable of 46 billion synaptic operations per second, per watt — literally a synaptic supercomputer in your palm. Such chips are called neuromorphic chips; they make heavy use of ANN to solve complex problems much as a human brain would.
ANN In Software:
Fully utilizing any computer architecture requires smart software written by smart programmers. But with the help of ANN and artificial intelligence, Google is trying to achieve something unimaginable: they want to get rid of programmers. They want computers to program themselves on the go and come up with solutions to problems that would otherwise require human intervention. They call such a machine a Neural Turing Machine.
Both these innovations in software and hardware bring up an old and very important debate about the “technological singularity.” Ray Kurzweil, in his book “The Singularity Is Near: When Humans Transcend Biology,” predicted many future technological outcomes and how they will supersede humans. So far, many of those predictions have come true, and the two innovations above are in line with them. The dark side of the technological singularity is that new technologies will be so powerful that they will transcend the current limits of our understanding.
So, the question is: Is the Technological Singularity Here?
The Government of India (GoI) is really good at rolling out new policies, massive infrastructure projects, and legislative bills. Whether such initiatives actually get implemented is another question. The Department of Electronics & Information Technology (DeitY), Government of India, is one ministry whose sole purpose to date has been introducing policies targeting the Indian tech community.
A few policies/plans initiated by DeitY:
In July 2010, then Human Resource Development Minister Kapil Sibal unveiled a prototype of a tablet called Aakash (commercially known as Ubislate); since then it has seen three commercial versions, and a fourth is in the pipeline. The main purpose of this project is to distribute low-cost, subsidized tablets to students in universities and colleges across India in order to create an e-learning platform. This idea was well received by many in the tech industry; even Vivek Wadhwa saw immense potential in it, and so did I.
Over time this project saw multiple hiccups and development issues, and now, with the change in governance, to me this project is just days away from being shelved. One of the main threats to the Aakash project is Google’s Android One, which was launched today. Android One has been developed for emerging markets with active involvement from Google, similar to the process followed for Nexus devices. Not only that, Android One will be manufactured by Indian companies, and that is a huge boost to “Make In India.”
Let’s see how Google could smack down the Aakash project:
Computer architecture research involves heavy use of simulators; most of these are cycle-approximate simulators (CAS) implemented in imperative (C), object-oriented (C++), and multi-paradigm (Python) languages. Some of the widely used simulators are Gem5, GPGPU-Sim, and McPAT.
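To give a flavor of what “cycle-approximate” means, here is a minimal toy sketch in Python (all opcode names and latency values are illustrative assumptions of mine, not taken from Gem5 or any real simulator): instead of modeling every pipeline stage, each instruction is simply charged a fixed latency, and the simulator accumulates an approximate cycle count while executing the program functionally.

```python
# Toy cycle-approximate simulator sketch (illustrative only; real
# simulators like Gem5 also model pipelines, caches, and branch prediction).

# Assumed per-instruction latencies in cycles (hypothetical values).
LATENCY = {"add": 1, "mul": 3, "load": 4}

def simulate(program, mem):
    """Functionally execute a list of instruction tuples and return
    the final register file plus an approximate total cycle count."""
    regs = {}
    cycles = 0
    for inst in program:
        op = inst[0]
        cycles += LATENCY[op]  # charge a fixed approximate latency
        if op == "load":       # ("load", dst, mem_address)
            _, dst, addr = inst
            regs[dst] = mem[addr]
        elif op == "add":      # ("add", dst, src_a, src_b)
            _, dst, a, b = inst
            regs[dst] = regs[a] + regs[b]
        elif op == "mul":      # ("mul", dst, src_a, src_b)
            _, dst, a, b = inst
            regs[dst] = regs[a] * regs[b]
    return regs, cycles

prog = [
    ("load", "r1", 0),
    ("load", "r2", 1),
    ("mul", "r3", "r1", "r2"),
    ("add", "r4", "r3", "r3"),
]
regs, cycles = simulate(prog, mem=[6, 7])
print(regs["r4"], cycles)  # 84 12  (4 + 4 + 3 + 1 cycles)
```

The trade-off this sketch illustrates is why such tools are called cycle-*approximate* rather than cycle-accurate: timing is estimated from coarse per-instruction costs, which runs fast but ignores stalls, hazards, and memory-system effects.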
Ideally, computer architects would like to work with languages that ease the construction of synthesizable register-transfer-level (RTL) designs and stay close to hardware design. Hardware description languages (HDLs) fall into this category; they are heavily used in VLSI research and in industry, but not in computer architecture research. SystemVerilog, Verilog, and VHDL are examples. [..]
There has always been a debate over which instruction set architecture (ISA) is better, RISC or CISC. The research led by Prof. Krste Asanović and Prof. David A. Patterson at the University of California, Berkeley (UCB) takes a radical approach to the ISA. They have come up with an open ISA called RISC-V. It might remind you of SPARC and OpenRISC; however, this open ISA is very different in terms of features.
(Image courtesy: Instruction Sets Should Be Free: The Case For RISC-V)