What seems like a lifetime ago, Myspace was my first exposure to the concept of "embedding."
Friends would embed their favorite songs or videos onto their profile page’s backend, and like magic, there it was playing on the screen. With a few simple scraps of code, it was now a part of the page, like an organ sewn into a body — pumping like it was there the whole time. Today, there are multiple types of AI that can be embedded in a similar fashion.
Myspace came and went (don’t tell J.T.), but embedding as a tech principle was firmly embedded in society’s fabric. And in 2018, you may hear the word more than ever, thanks to the inescapable innovations and adoption of artificial intelligence software. Embedded artificial intelligence (AI) is a major step forward for this technology, “moving the chains” of AI toward its full potential in everyday use.
For the purposes of this conversation: What is AI? It’s a wide-ranging topic that boils down to technology that can think, in a manner of speaking. Buzzwords like “machine learning” describe the newfangled programming that allows cloud applications and computer systems to adjust behaviors without manual input from users. In short, they learn.
Embedded AI is the advancement of these technologies to include software and small devices used by everyday people, with integrated machine/deep learning designed to simplify the user’s day-to-day tasks. You remember that Myspace song you embedded to autoplay when someone loaded your page? It’s like that, but the song changes based on the mood of the room or which people are in it, and adjusts the volume based on the time of day, etc. We can see the beginnings in Alexa and Google Home — with untold upgrades on the way — but your car, clothes, consoles and computers are next in line.
Colin Angle, co-founder, CEO and Chairman of the Board at iRobot, once said: “It’s going to be interesting to see how society deals with artificial intelligence, but it will definitely be cool.” Let’s file embedded AI into the “cool” folder for now, although there is little doubt it will stoke some concerns as it grows in size and results. The underlying hopes with embedded AI are that businesses and individuals will soon be freed from some humdrum, routine tasks — and able to direct that time and energy elsewhere. This revolution is well underway, and many believe it will soon alter our lives as swiftly and thoroughly as smartphones did, with maybe greater potential.
There are three primary types of AI, all relevant to embedded intelligence:
- Natural language processing (NLP)
- Computer vision (image recognition)
- Speech/voice recognition software
We delve into these and other modern tech terms in our recently published Digital Trends series. As humans, we rarely consider just how nifty it is to register, absorb and apply the things we hear and see. Every conversation we have and skill we develop is owed in part to our comprehension of this intake. Giving software even a sliver of this ability is destined to change the workday as we know it. Unfortunately, some jobs will be deemed unnecessary as a result of embedded AI and intelligent hardware devices (let’s just call them robots). For a majority of the workforce, though — and the rest, while we await the perfection and mass production of robots — these developments bode well as productivity boosters and stress reducers.
We’ve previously covered some specific use cases for AI in human resources, AI in accounting and AI in sales teams — even the legion of screenwriters in Tinseltown. But in reality, virtually everyone in your office will feel the impact of software with embedded AI in the coming years. And just as it happened with smartphones, we’ll soon have trouble remembering a world without it.
In the development community, embedded AI has been making a steady crawl from the blueprints to the big screen. A major example is the Embedded Systems Conference, which now holds annual events in Silicon Valley, India, Boston and Minneapolis. Here, attendees can preview the latest advancements in applicable AI, and workshop with experts about the nitty gritty — aka, how these innovations will affect them and their business.
Which raises the question: How will certain types of AI, including embedded AI, affect you?
Think automated sorting of files based on content. Think prioritization and scheduling of tasks. Think data entry and quality checking, and strategy recommendations based on successes or failures. Tools that adjust shape and strength in response to field conditions, and assembly lines that stop, speed up, or pivot without causing a fuss. Security cameras that send alerts when certain people are identified, and open doors for others. The list goes on.
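To make the first of those use cases concrete, here is a minimal, hypothetical Python sketch of automated file sorting. The keyword rules are a deliberately simple stand-in for the kind of content classifier an embedded AI model would provide; the category names and function names are invented for illustration.

```python
import os
import shutil

# Hypothetical keyword rules standing in for an embedded AI content classifier.
CATEGORY_KEYWORDS = {
    "invoices": ["invoice", "amount due", "payment"],
    "reports": ["quarterly", "summary", "analysis"],
}


def classify(text: str) -> str:
    """Return the first category whose keywords appear in the text."""
    lowered = text.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(word in lowered for word in keywords):
            return category
    return "misc"


def sort_files(src_dir: str, dest_dir: str) -> dict:
    """Move each file in src_dir into a category folder under dest_dir.

    Returns a mapping of filename -> assigned category.
    """
    moved = {}
    for name in os.listdir(src_dir):
        path = os.path.join(src_dir, name)
        if not os.path.isfile(path):
            continue
        with open(path, encoding="utf-8") as f:
            category = classify(f.read())
        target = os.path.join(dest_dir, category)
        os.makedirs(target, exist_ok=True)
        shutil.move(path, os.path.join(target, name))
        moved[name] = category
    return moved
```

In a real embedded-AI tool, `classify` would be backed by a trained model rather than a keyword table, but the surrounding plumbing — read, categorize, route — looks much the same.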
Some of the most skilled, forward-thinking companies in the world are going all-in on embedded AI, with these goals in mind along with others we can’t begin to envision. This field of tech goes hand-in-hand with the internet of things, which involves devices that can communicate with each other and generate lakes of data based on trends in activity, demand and external factors. Stir in embedded AI, and the possibilities are seemingly endless; Terminator and 2001: A Space Odyssey comparisons abound, from both a “whoa” and a “yikes” perspective. And this is definitely a case where both sides are worth hearing out.
Like any huge step forward in technology (e.g., blockchain, Snapchat, AR/VR), the path forward for embedded AI will be bumpy. It’s one more developing story to follow, debate and give a fair shake before shaking your fists. Even the people creating this tech aren’t entirely sure of its ceiling or how the mainstream will accept it. But as Angle put it, it’s hard for anyone to deny the “cool” factor, at least on paper.
The nuts and bolts of embedded AI are explored in research by influencers like Microsoft and Oracle. Industry leaders like these not only have the resources to experiment and find success, but the power to reach a wide audience with embedded solutions right out of the gate. In November, Salesforce announced myEinstein, a point-and-click platform on which companies can plop AI into their own custom apps. At the same time, upstarts like Uncanny Vision are making headlines for their own breakthroughs — the India-based team was recognized alongside other innovators in Amazon’s AI Awards.
Suffice it to say the race is on. But as we recently learned when exploring the best tech cities across the United States, there’s no cap when it comes to creation. This new frontier only stands to benefit from collaboration, transparency and patience from vendors and consumers alike. If we’re able to accomplish that, then embedding will be worth betting on.