June 8, 2017

Hidden Stories: How AI Will Advance Careers, Not Steal Them


Sizmek

In the movie Hidden Figures, an African-American woman named Katherine Johnson breaks both gender and racial barriers, in service of her country, by hand-verifying the precise calculations needed to launch John Glenn into orbit and bring him safely back to Earth.


In a feat of both mathematical skill and civil-rights progress, Johnson went down in history for doing something that seems almost completely foreign to technologists in the 21st century: working out and verifying a computer's calculations by hand, on a chalkboard. Today, of course, NASA remains at the forefront of computational development, having helped drive such modern technological necessities as the integrated circuit and the communications satellite.

Another NASA employee, Dorothy Vaughan, a racial and gender pioneer in her own right, saw the computer revolution coming and taught herself FORTRAN, an early programming language developed by IBM for its mainframes. Vaughan took it upon herself to re-train the NASA "computers who wore skirts" so that they would be prepared when digital computers took over the laborious computational work originally done by hand. Johnson, too, later worked with digital computers to calculate, among other missions, the Apollo 11 flight to the moon.

Beyond being a great story of human perseverance, ingenuity, and excellence, Hidden Figures points to a strong parallel between the computer revolution of the 1950s and 1960s and the A.I. revolution happening today. There has been more than a little hand-wringing over Artificial Intelligence and robotics replacing the human workforce, with business leaders, academics, and technologists across multiple industries spelling doom for a variety of vocations. But as many have pointed out, and just as Dorothy Vaughan foresaw with the computer revolution, it's more likely that many human jobs will change rather than vanish entirely.

If experience with the computer revolution is any indication, it will be quantitative jobs that are most affected at first. As Andrew McAfee puts it in The Second Machine Age, the book he co-authored with Erik Brynjolfsson, "repeatable" jobs like factory work, law, and accounting will be affected before "non-repeatable" work such as cleaning, creative design, strategy, or sales.

This is in stark contrast to earlier thinking, which posited that advances in computing would first mechanize low-cognition work and only later automate high-cognition work. Instead, with the constant doubling of computing power, a machine with sophisticated AI or "deep learning" can handle highly cognitive but repeatable work, like filing a tax return, which generally looks at the same set of variables to produce a similar-looking output. Yet that same machine will fail if sent into a school bathroom to clean it effectively, because of all the possible externalities: the size of the room, the number and kinds of fixtures and surfaces to be cleaned, the type of mess encountered, and so on.
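To make the contrast concrete, here is a deliberately toy sketch of a "repeatable" task (the brackets and rates are invented for illustration, not real tax rules): the same few variables go in, the same fixed logic runs, and a predictable number comes out, which is exactly the kind of work a machine handles well.

```python
# A deliberately simplified illustration of a "repeatable" task:
# the same small set of variables and the same fixed logic, every time.
# (The brackets and rates below are invented for illustration only.)
def tax_owed(income: float, deductions: float) -> float:
    taxable = max(income - deductions, 0.0)
    if taxable <= 10_000:
        return taxable * 0.10
    if taxable <= 40_000:
        return 1_000 + (taxable - 10_000) * 0.20
    return 7_000 + (taxable - 40_000) * 0.30

# Same variables in, a predictable number out: easily automated.
print(tax_owed(55_000, 12_000))
```

There is no comparably short function that could enumerate every room size, fixture, and mess a cleaner might encounter.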

And it goes beyond repeatable jobs, as future vocations will require people to be more thoughtful and analytical. In the words of the venerable Mark Cuban: "there's going to be a greater demand for liberal arts majors in 10 years…[technology] will spit out options for you, you need a different perspective in order to have a different view of the data. And so having someone who is more of a freer thinker [will lead to a successful result]."

Another way to say this is: in the "Age of A.I.", it's expensive to do anything for the first time, because it requires a whole new model to be built, but reproduction (3D printing, stored-data analysis, learned rules) is cheap and easy. Better still, Artificial Intelligence allows machines to iterate (within certain confines) and adapt to a given set of circumstances over time.
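As a minimal sketch of that economics (assuming a scikit-learn-style workflow and made-up numbers), the expensive step is building the model once; applying its learned rules afterwards is cheap and endlessly repeatable:

```python
# A minimal sketch of "expensive the first time, cheap to reproduce."
# The scenario and the data are invented for illustration; assumes scikit-learn.
from sklearn.linear_model import LinearRegression

# The one-time, costly step: build a model from historical observations.
campaign_spend = [[1000], [2000], [3000], [4000]]  # dollars spent (toy data)
conversions = [12, 25, 33, 47]                     # outcomes observed (toy data)
model = LinearRegression().fit(campaign_spend, conversions)

# The cheap, repeatable step: apply the learned rule to any new input.
for spend in ([1500], [2500], [5000]):
    predicted = model.predict([spend])[0]
    print(f"spend ${spend[0]}: predicted conversions ~ {predicted:.1f}")
```

The same pattern holds whether the "reproduction" is a prediction, a 3D print, or a learned rule applied to a new set of circumstances.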


So, what does this paradigm mean for a quantitative job that also needs to engage in strategic context? A great example of a career that must adapt to changing technological conditions is that of the analyst, a role that traditionally executes a given set of numeric tactics in a spreadsheet (not dissimilar to Katherine Johnson's role at NASA in the 1950s). In the age of A.I., though, it's no longer enough for that role simply to find the "right" numbers, or even the "important" numbers. The analyst of the future will use human intuition to look beyond a numeric goal and explain why those numbers "matter". The value of a human analyst will be to transcend an initially stated objective and find patterns that give enough proof to build a model that fixes a larger problem.

Analytics jobs will therefore become more important than ever, just as the Hidden Figures "human computers" had long and fruitful careers at NASA even after the advent of the IBM mainframe. To be successful, analysts must work with technology to plug its gaps rather than simply operate it. This leaves two important roles for analytics professionals: the data scientist and the quantitative consultant. Both differ from the traditional analyst, who looks at a set of numbers in context and says, "here are the calculations I need to perform." A data scientist says, "these inputs are valuable to a given goal," and a quantitative consultant says, "hey, that's an interesting story."

It's this notion of storytelling and value within a set of quantitative data that will separate an "age of A.I." analyst from one who has traditionally focused on the correct outcome. And just as when Katherine Johnson was checking a computer's calculations by hand more than half a century ago, how a result is calculated remains, more than ever, as important as the result itself. Moving forward, true progress will be attained when humans and machines work in concert, each focusing on its strengths: a machine's ability to crunch a tremendous amount of data and solve a problem within a given context, and a human's ability to look beyond those confines to glean insights that apply to a larger narrative.