
Human-Plus-Machine Computing: Beyond Moore’s Law

December 5, 2016

Featured article by Adam Devine, Vice President of Marketing, WorkFusion


How do we move forward in a world where Moore’s Law no longer holds true? For five decades, Gordon Moore’s famous prediction that processing power would double roughly every two years held firm. It was a reliable constant as innovators kept packing more transistors onto each square inch of an integrated circuit. But all good things must come to an end, and Moore’s Law has been confounded by another, more immutable law: physics. Transistors have shrunk to the size of molecules, but they cannot, at least not yet, shrink to the atomic level. Yet even as raw processing power hits its physical limit, innovation in other areas has not. This is an opportunity for the combined forces of artificial and human intelligence to shine.
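To put that pace in perspective, here is a quick back-of-the-envelope calculation, sketched in Python; the clean 50-year run and the exact two-year doubling period are simplifying assumptions, not figures from this article:

```python
# Rough illustration of Moore's Law: transistor counts doubling every two years.
# Assumes a clean 50-year run and an exact two-year doubling period.
years = 50
doubling_period = 2
doublings = years // doubling_period       # 25 doublings
growth_factor = 2 ** doublings             # about a 33.5-million-fold increase

print(f"{doublings} doublings -> roughly {growth_factor:,}x the original transistor count")
```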

About a year ago, Irving Wladawsky-Berger offered one of the more interesting observations on the end of this era in his post, “Computing Beyond Moore’s Law.” He compared the post-Moore’s-Law IT industry to the Cambrian Explosion, the moment when “evolution deemed the cell to be good enough” and allowed “the development of complex life forms.” In the same fashion as that biological leap 542 million years ago, Mr. Wladawsky-Berger wrote, “innovation has now shifted to the creation of all kinds of digital life forms and to the data-driven analytic algorithms and cognitive designs that infuse intelligence into these artificial life forms.” He suggested that, beyond the linear hardware path of exponential computing gains, there were other vectors for growth.

His observations are in keeping with the human capacity to continue innovating. Cloud computing is now powerful enough to transfer knowledge from a globally distributed enterprise workforce to machine learning algorithms brought together on a software platform that can be trained to perform complex, time-consuming tasks. Rather than being confined to hardware transistors, processing happens at the desktop as humans make decisions; in the cloud as classic Markov models or newer deep learning neural networks train on those decisions and become increasingly autonomous at the given tasks; and on company servers as automation deploys the rules that are incrementally and dynamically learned from humans.
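In code, that division of labor looks something like the loop below. It is a minimal, self-contained sketch of the general human-in-the-loop pattern described above; the toy model, the confidence threshold, and the task representation are illustrative assumptions, not WorkFusion’s actual platform or APIs:

```python
# Minimal human-in-the-loop sketch: humans handle tasks, a model trains on
# their decisions, and automation takes over once the model is confident.
import random

class NearestNeighborModel:
    """Toy classifier that learns from accumulated human decisions."""

    def __init__(self):
        self.examples = []  # (feature, label) pairs captured from humans

    def _neighbors(self, x, radius=0.05):
        return [label for feat, label in self.examples if abs(feat - x) < radius]

    def confidence(self, x):
        votes = self._neighbors(x)
        if len(votes) < 3:
            return 0.0                                   # too little evidence to automate
        return abs(sum(votes) / len(votes) - 0.5) * 2    # 0 = unsure, 1 = unanimous

    def predict(self, x):
        votes = self._neighbors(x)
        return round(sum(votes) / len(votes)) if votes else 0

    def learn(self, x, label):
        self.examples.append((x, label))

def human_decision(x):
    """Stand-in for a person making the call at the desktop."""
    return 1 if x > 0.5 else 0

model = NearestNeighborModel()
automated = 0
for _ in range(2000):
    task = random.random()              # an incoming work item
    if model.confidence(task) > 0.9:
        model.predict(task)             # learned rules handle the item automatically
        automated += 1
    else:
        label = human_decision(task)    # route the exception to a human...
        model.learn(task, label)        # ...and train on that decision

print(f"Tasks handled without human input: {automated} of 2000")
```

As human decisions accumulate, more and more incoming items clear the confidence threshold, so the share of work handled without human input grows over time, which is the “increasingly autonomous” behavior described above.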

Right now, there is a lag in hardware advances, which should be an added incentive to increase the R&D that goes into improving the cognitive power of software and into exploring the myriad ways that humans and machines can interact to improve all manner of information processing. There are certainly bigger strides to be made in human-machine collaboration or, as John Markoff describes in his excellent book, Machines of Loving Grace, intelligence augmentation (IA). In fact, three years ago, scientists at Harvard went in an entirely different direction on transistor improvement and created one that behaves like a brain, calling it a synaptic transistor. Like “wetware” (biological computing), the synaptic transistor is not binary; it gets stronger as it relays more signals, the same way neural connections strengthen to enable learning.
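That “strengthens with use” behavior is essentially Hebbian learning in hardware. As a rough conceptual sketch only, not a model of the actual Harvard device, each relayed signal could nudge an analog connection weight upward:

```python
# Conceptual sketch of a connection that strengthens as it relays signals,
# Hebbian-style. The update rule and numbers are illustrative assumptions.
weight = 0.10                             # analog connection strength, not a binary 0/1
for signal in [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]:
    if signal:
        weight += 0.08 * (1.0 - weight)   # saturating increase toward 1.0
    print(f"signal={signal}  strength={weight:.3f}")
```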

Organizations have gotten the message that the next great innovation in computing probably isn’t going to come from trying to extend Moore’s Law. Harvard and other centers of innovation are focusing on creating a new era in which the human intellect partners with artificial intelligence for exponential gains in productivity. Such gains will boost profits and better engage employees at the same time. Moore predicted that his theory would hold “for the foreseeable future,” and he was right as far as anyone could then foresee; he could not have imagined what is possible in our world today. We’ve moved from huge black-and-white TV sets to hand-held entertainment centers, and there’s no telling where human-plus-machine processing will take us.

About the author:

Adam leads market development, product and brand marketing, and strategic partnerships at WorkFusion. He began his career in management consulting in the Financial Institutions Group at BearingPoint and has spent the past 14 years in tech product marketing and advertising. He was most recently director of strategy at 360i. Adam holds a bachelor’s degree from the University of Vermont.
