Good data is a key part of AI and machine learning innovation

When the Biden administration launched an AI task force earlier this month to create a path to “democratize access to research tools to promote AI,” the goal of access was paramount.

“The task force is made up of some of the best experts in academia and industry,” said Dinesh Manocha, professor of computer science and electrical and computer engineering at the University of Maryland, on Federal Monthly Outlook – Workforce Reallocation Through Automation. “They recognize the importance, and they are pushing for more development in the field by making good data available. Data is therefore an essential part of methods based on AI and machine learning.”

Manocha said AI is as old as the field of computer science itself, referring to “founding father” Alan Turing, who he said laid the groundwork in the 1950s.

“Machine learning is a subdomain within the larger field of AI,” Manocha said on Federal Drive with Tom Temin. “All of the recent developments in AI, all of the penetration into the real world, have been driven primarily by the enthusiasm for machine learning over the past five to ten years.”

The breadth of AI and machine learning is evident in the courses offered at the University of Maryland and in what students want to study.

“A lot of computer science majors want to take AI,” Manocha said. “Machine learning itself has become such an important subtopic that we even offer several courses at the undergraduate and graduate levels.”

By focusing on data and algorithms, AI and machine learning mimic the way humans learn. “So you know that one of the big challenges in AI is how to mimic human intelligence, which is still a big open problem,” Manocha said. “There have been a lot of approaches pursued and proposed by wonderful researchers over the past 50 to 60 years.”

Manocha highlighted the great advances in AI and machine learning when talking about the sub-branch of “deep learning,” which mimics human knowledge and thinking. But there is still some way to go.
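
As a rough illustration of what learning from data means in practice, here is a minimal sketch of a tiny neural network trained with gradient descent. The task (XOR), layer sizes, learning rate, and epoch count are all illustrative assumptions, not details from the interview.

```python
# Minimal sketch: a tiny two-layer neural network learning XOR with numpy.
# All choices (task, layer sizes, learning rate, epochs) are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Training data: the XOR function, a classic non-linear toy problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialized weights for a 2 -> 4 -> 1 network.
W1 = rng.normal(0, 1, (2, 4))
b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)    # hidden activations
    out = sigmoid(h @ W2 + b2)  # predictions in (0, 1)

    # Backward pass: gradients of the mean squared error.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent update.
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0)

print(np.round(out, 2))  # approaches [0, 1, 1, 0] as the network fits XOR
```

Nothing here is hand-coded about XOR itself; the network's behavior comes entirely from the data and the update rule, which is the sense in which such systems “learn.”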

“If you have data, you can easily get 50-70% accuracy,” Manocha said. “To go from 50 to 70, to 90%, you need 100 times more data. The complexity seems to grow exponentially … So for every 10% relative improvement, we need 100 times more data, and that part of getting 100 times more data, in all possible situations, is very difficult.”
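
That rule of thumb compounds quickly. A back-of-the-envelope sketch, assuming the “100 times more data per 10% relative improvement” scaling holds literally (the starting accuracy and dataset size are made-up figures):

```python
# Back-of-the-envelope sketch of the scaling claim: each 10% relative
# improvement in accuracy costs roughly 100x more data.
# Starting accuracy and dataset size are illustrative assumptions.
accuracy = 0.70    # assume 70% accuracy from an initial dataset
examples = 10_000  # assume that dataset holds 10,000 examples

for step in range(3):
    accuracy *= 1.10  # one 10% relative improvement...
    examples *= 100   # ...costs ~100x more data under the claim
    print(f"{accuracy:.0%} accuracy -> ~{examples:,} examples")
```

Under these assumptions, three such steps take the hypothetical dataset from ten thousand examples to ten billion, which is the exponential growth Manocha describes.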
