For years, artificial intelligence (A.I.) and machine learning (ML) were used as catch-all terms for highly technical processes most people couldn’t grasp. However, more and more businesses are transforming A.I. and ML from catchy buzzwords to actual processes, with self-learning algorithms integrated into everything from data analytics to mobile apps.
For many technologists, A.I. and ML are now must-have skills. A growing number of technology job postings ask for A.I. and ML experience, and not knowing something about these technologies could prove detrimental in a few years. But how does one get started in A.I. and ML? Are these skills really worth learning?
We spoke to those already immersed in the field to understand how they got their start, whether they feel A.I. and ML are actually critical to learn, and where they feel the disciplines are heading.
How It Began
Ian Paterson, CEO of cybersecurity firm Plurilock, took a very roundabout way into tech. “I started my working career as a rock n’ roll roadie,” he tells Dice, “doing pyrotechnics and lighting design, and fell into the tech world somewhat by accident when I moved to Victoria in British Columbia. I began working in the data science division of Terapeak, which eBay later acquired, and fell in love with the industry.”
After that experience, he continues, “I decided to chart my path in the machine learning and A.I. space, bootstrapping Exapik, a data monetization company, which FlashGate later acquired.”
James Kaplan, CEO and co-founder of MeetKai, was drawn into A.I. and machine learning via a different route. “For me, the original draw to A.I. was based on video games. I loved the idea of learning how ‘intelligence’ on the other side of the screen operated. It was often disappointing to learn how simple the systems were, but the fun of figuring it out was exhilarating.”
But Kaplan’s interest wasn’t limited to game A.I.: “When playing a lot of the repetitive grinding games of the early 2000s, I found myself thinking about how tedious they were. I ended up writing A.I. bots to play the games for me. While this did make the act of playing games a lot less fun, it did motivate me to get much better at A.I. programming.”
Others have taken a far more traditional path into this emerging field. “In college, I took a class about data mining and machine learning and was completely fascinated,” says António Alegria, Head of AI at OutSystems. “Following my Master’s in Informatics and Computer Engineering, I began working with data-intensive technology and products. My first job was in a large telecom company, working on automatically inferring the IT architecture of the entire organization by looking at the packets going through the network.”
Chuck Everette, Director of Cybersecurity Advocacy at Deep Instinct, arrived a bit later to the A.I. and ML party. “It was not until 2018, when I was personally introduced to deep learning (a subset of machine learning), that I saw the possibilities of a true A.I. framework,” he says. “I have been in cybersecurity leadership roles for multiple Fortune 500 companies as well as security vendors, and I can honestly say that when I was shown deep learning’s capabilities applied to cybersecurity, my jaw dropped. I did my due diligence and dug into the technical aspects and nuances of this seemingly new technology.”
Everette adds: “Over the years, I had used and been exposed to other security products using machine learning, but deep learning was, for lack of a better term, ‘mind-blowing’ in its capabilities and accuracy.”
Are A.I. and ML Skills in Demand?
Some disciplines have a sneaky way of punching above their weight. Headline-grabbing features or products can make a language or skillset seem far more used than it actually is. But seasoned technologists argue this isn’t the case with A.I. and machine learning.
“A.I. is at the cutting edge of innovation and will continue to change the world as we know it,” Everette tells Dice. “Honestly, there is virtually no industry that has not already been impacted by A.I. in some way. Some of the most notable growth areas are in healthcare, where diagnostic imaging and drug R&D make heavy use of A.I. to streamline and rapidly improve products and services.”
For example, A.I. can accelerate the research behind vaccines such as the one for COVID-19. According to Phys.org, the company that helped develop the Johnson & Johnson vaccine worked with MIT researchers who used A.I. and machine learning to guide their efforts.
Jared Peterson, Senior Vice President of Engineering at SAS, suggests it’s about more than software: “Advances in deep learning, the compute necessary to enable those advancements (e.g., GPUs), and the frameworks that make it all accessible have brought about a renaissance in the world of computer vision and NLP. The pace of research and publishing in these areas has been staggering.”
Volodymyr Kuleshov, co-founder and chief technologist at Afresh, suggests the marriage of hardware and software will mean big things for the future. “A.I. breakthroughs over the past 10 years have come from scaling existing algorithms to large datasets using specialized hardware, and this trend shows no signs of stopping. We’ve witnessed breakthroughs in machine translation, language understanding, and other areas using scaled-up generative models of language, which is another area I anticipate will continue for several more years.”
Lorenzo Bavasso, director of data analytics and AI at BT Global, reminds us that, while A.I. and ML power heady things like rapid vaccine creation and are constantly breaking new ground, both are also embedded in technology that people use daily. “We are still evolving very fast in the data industry, in terms of technology, approach and spectrum of opportunity. The overall maturity is nowhere near complete yet. There is a big gap between high performers and ‘the others’: a lot of A.I./ML capabilities are still at prototype/pilot or ‘minimum viable’ status at best and not utilized at scale. There is a lot of focus on making A.I./ML more ‘friendly’ and less niche.”
Bavasso adds: “For some proof points, we can just look at the industry landscape. A.I./ML as an offering is now part of most SaaS portfolios, the number of people involved in the data and A.I. industries is growing significantly, and demand from businesses is skyrocketing. Whilst the sophistication is still growing, I think there is more and more focus on scale.”
The Future of A.I. and ML
But where are A.I. and ML actually headed? Responses were wildly varied. Perhaps that’s a nod to the potential A.I. and ML have to dominate technology over the next five to 10 years: if nobody can agree on what the use cases are, A.I. and ML may be a great fit for almost anything.
“If you had asked me that five years ago, I would have said our self-driving cars would be talking to our virtual assistant drivers,” Kaplan says. “My gut feeling is that, in the next five years, we will see a push to bring the A.I. that has developed in the pure R&D world to consumer applications.” A.I. and ML can’t always deliver in a consumer context, and it may take time to boost speed and performance; nobody wants a search engine that takes 20 seconds to respond to a query.
Kaplan also points out that, while A.I. will frame how we work in the future, it may also influence how (and why) people are hired:
In the next 10 years, I think we will see augmentation/enhancement-based A.I. become the default under which most new hires operate. People will be hired for their skills in using A.I. to make them more efficient. We have already seen this in spaces where A.I. is present, like drones, manufacturing, and more. By 2030 we will see it permeate every white-collar and blue-collar job. It is rather exciting for me to think about. Just think about how much more effective people are at work with smartphones; the inclusion of A.I. in work will be an order of magnitude beyond that.
Alegria agrees with that assessment: “In the next 10 years, we're going to see startups applying A.I. to assist in day-to-day strategic decisions of organizations, including algorithmic strategy and planning. This is not about automatic transactional decisions like we already see in credit, lending, and fraud detection, but rather a greater focus on strategic, impactful decision making.”
“We’re going to see massive adoption of A.I. technologies over the next five to 10 years,” Paterson tells us, “one reason being that companies at the edge (like Apple) are using this technology in hardware to enable speed improvements.” Less-technical industries will also begin to adopt A.I. into their workflows and products.
Melody Yang, an iOS engineer at Apple, feels ML modeling may soon become as common as JavaScript libraries. “In the next five years, I think off-the-shelf ML models for mobile apps will be easier to integrate and provide more flexibility to customize. Current solutions include pre-trained models for transfer learning, which allows developers to use pre-trained models with their own datasets by tweaking a few layers of the neural network or modifying the activation function for desired outputs.”
Yang adds: “While this accelerates progress, it has limitations: pre-trained models can only be used for tasks similar to those they were trained on, and the dimensions of new datasets have to match the ones the pre-trained models were fed… More companies have focused on making it easy for developers to integrate ML models into apps. This means more solutions will be created for diverse use cases, as well as for developers of all experience levels.”
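Yang’s description maps to a common transfer-learning pattern: load a pre-trained network, freeze its layers, and swap in a new output layer sized to your own dataset. She doesn’t name a framework, so here is a minimal sketch assuming PyTorch and torchvision, with a hypothetical five-class image task:

import torch
import torch.nn as nn
from torchvision import models

# Load a backbone pre-trained on ImageNet (torchvision 0.13+ API).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pre-trained layers so their weights stay fixed.
for param in model.parameters():
    param.requires_grad = False

# Swap in a new final layer sized to the (hypothetical) five-class dataset.
num_classes = 5
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Train only the new layer's parameters; the rest of the network is reused as-is.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

Freezing the backbone keeps training cheap, but it also illustrates Yang’s caveats: the reused layers only help on tasks similar to the original one, and inputs must still match the dimensions the backbone expects.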
From simple solutions to pandemic-beating health services, A.I. and ML are already woven deep into our tech-first society. Across the board, our experts agreed both A.I. and ML will soon become so integrated into our lives that we will have difficulty doing tasks without some level of assistance from a trained model or “smart” assistant.
For technologists, this means A.I. and ML are quickly becoming core competencies, not technobabble to be vaguely understood. These disciplines will shape how we accomplish most tasks in the near future; overlooking their potential would be a mistake.