Computer hardware technology has always followed a simple trend: shrinking in size while growing in processing capacity. Just look at the progression of cell phones, which were once the size and weight of bricks and now fit comfortably in the palm of your hand. The amount of memory and data modern smartphones can process would have been labeled impossible 30 years ago. In 10 to 20 years, we will undoubtedly see new forms of computer hardware emerge and influence various industries.
An article released a month ago describes a new kind of computer chip, scheduled for release in 2014, that could change the way computers process information. This neural chip is modeled on the biological nervous system: how neurons react to stimuli and connect with other neurons to share and interpret information. This differs from modern computing, where computers are programmed in advance with all the information they need to function, like a recipe. The neural chip lets computers take on new tasks beyond what they were initially programmed to do, as well as other functions that would normally take painstakingly long hours of programming to achieve. This design departs from the typical architecture of computers, which was heavily influenced by the ideas of John von Neumann: microprocessors perform operations in binary code and store information in the processor, in memory chips, or on disk drives. Data are moved in and out of the processor's short-term memory while the computer carries out a programmed action, and the result is then moved to main memory. The new processor instead has connections that mimic biological synapses; the connections between circuits are organized according to correlations in the data the processor has already learned. New information changes the network in the processor, programming the computer's next actions much as new information shapes our own decisions.
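To make that contrast concrete, here is a minimal sketch in Python. It is my own toy illustration with made-up numbers and a simple Hebbian-style weight update, not the chip's actual mechanism, which the article does not detail:

```python
# A von Neumann-style program: every step is spelled out in advance,
# like a recipe. The computer only does what it was told.
def fixed_recipe(reading):
    if reading > 0.5:
        return "alarm"
    return "ignore"

# A neural-style approach: connection strengths (weights) are adjusted
# by the data itself, so behavior changes as new examples arrive.
# Toy Hebbian-style update; numbers and threshold are made up.
weights = [0.0, 0.0]

def learn(inputs, learning_rate=0.1):
    # Strengthen connections between inputs that fire together.
    for i, x in enumerate(inputs):
        weights[i] += learning_rate * x * sum(inputs)

def respond(inputs):
    # The "program" is now the learned weights, not hand-written rules.
    activation = sum(w * x for w, x in zip(weights, inputs))
    return "alarm" if activation > 1.0 else "ignore"

# Feed it a few correlated examples and its behavior shifts.
for example in [[1.0, 0.9], [0.8, 1.0], [0.9, 0.8]]:
    learn(example)
print(respond([1.0, 1.0]))  # output now depends on what it has seen
```

The point of the toy: in the second approach the "program" lives in the learned weights, so new data literally rewires what the system does next.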
A new generation of artificial intelligence systems may be possible with this chip: systems that can speak, listen, navigate, see, and manipulate or control objects with ease. Facial and speech recognition would be transformed, since a computer system could learn how we look and sound after a few meetings. Imagine that instead of unlocking the door to your house with a key, you could unlock it simply by walking up and saying anything (with a little help from biometric sensors and scanners, of course). While visual and audio locks already exist, they are limited to specific phrases or functions, and they remain expensive.
Another angle on this development: we all had to learn how to walk. Sure, our leg muscles weren't developed enough until we were around one year old, but we still had to learn to balance ourselves before we could walk with confidence. Computers with neural chips installed will be able to learn from trial and error on their own: not only to walk, if they are robots, but to recognize people, cats, dogs, the habits of ordinary people, or even a burglar.
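The article doesn't spell out how that learning would work, but a standard trial-and-error scheme from reinforcement learning gives the flavor. The sketch below is a toy Python example with a hypothetical "balance" reward, not anything taken from the chip itself:

```python
import random

# Toy trial-and-error learner: it tries actions, keeps a running score
# for each one, and gradually favors whatever worked. This is a generic
# reinforcement-learning idea, not the chip's algorithm.
actions = ["lean_left", "stand_straight", "lean_right"]
scores = {a: 0.0 for a in actions}
counts = {a: 0 for a in actions}

def reward(action):
    # Hypothetical environment: standing straight keeps balance best.
    return 1.0 if action == "stand_straight" else random.uniform(0.0, 0.5)

for trial in range(200):
    # Mostly exploit what has worked; occasionally explore (the "error" part).
    if random.random() < 0.1:
        action = random.choice(actions)
    else:
        action = max(actions, key=lambda a: scores[a])
    r = reward(action)
    counts[action] += 1
    # Update the running average reward for the chosen action.
    scores[action] += (r - scores[action]) / counts[action]

print(max(actions, key=lambda a: scores[a]))  # typically "stand_straight"
```

Like a toddler, the learner falls over a lot early on; unlike a conventional program, nobody ever tells it which action is correct.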
Resource:
Markoff, John. "Brainlike Computers, Learning From Experience." New York Times, 28 Dec. 2013. Web. 13 Jan. 2014. <http://www.nytimes.com/2013/12/29/science/brainlike-computers-learning-from-experience.html?partner=rss&emc=rss&_r=1&>.
Ian,
You mentioned a trial-and-error approach to this advanced technology. I personally don't believe it's comparable to an infant learning to walk. If this chip leads to some sort of artificial intelligence, it would be critical that it perform as expected at all times. While this type of advancement is incredible, it could also lead to dangerous outcomes. If important tasks are entrusted to this artificial intelligence, trial and error is not an option. If your analogy is correct, then one would not be able to trust advanced tasks to an AI any more than one could trust a human being with them. While this technology would make computers more human-like, it would take away from what they are designed to accomplish and how they are expected to perform.
Mike
Do you feel this chip will be able to aid the construction industry, and if so, in what ways? I feel that trusting AI to utilize BIM or run computer simulations would benefit the industry by saving time, but, as discussed in class, it could also take away jobs.
Additionally, I agree with Mike: we've been taught never to trust a computer model, and I can only assume the same would apply to AI. Have you come across any numbers on the accuracy of the chip? Trial and error isn't an option in many industries. I feel a chip like this would take a few decades to become involved in our industry; the maker of Nest was recently quoted saying he believes the "smart home" is still at least a decade off. I'd be curious to see what kinds of tests this chip will be put through come its release date in 2014.