The difference between machine learning and artificial intelligence

by Sahana Rajan, December 15, 2016

Artificial Intelligence (AI) has entered our daily lives like never before, and we have yet to unravel the many other ways in which it could flourish. Spanning speech recognition, virtual personal assistants, video games, smart cars and fraud detection, AI is taking great strides towards the futuristic vision of an AI-driven tech world (think of Salesforce's Einstein, or modern chatbots!).

Discussions of AI are generally peppered with the terms 'Machine Learning' and 'Deep Learning'. Moreover, AI and Machine Learning are often used interchangeably. Though the two are closely related, they have different meanings. This article walks through those differences and points out how the concepts are interconnected.

Broad Differences between Artificial Intelligence and Machine Learning

While there are many arguments about the specific differences between them, most researchers agree broadly on two points: Artificial Intelligence entered the technology landscape well before Machine Learning, and, more importantly, Machine Learning is a subset of Artificial Intelligence. The 1950s saw growing excitement about artificial intelligence, which was followed by the rise of machine learning in the 1980s. This in turn was succeeded by the breakthrough of deep learning, itself considered a subset of Machine Learning. In its early, undeveloped form, artificial intelligence was primarily an idea; it took shape in our daily lives through machine learning, and could be exploited even more significantly through deep learning.

Until around 2012, artificial intelligence swung between two extremes: looking like complete science fiction, or being merely an arrangement of mechanical parts put together very cleverly (smart, but not intelligent). It was around 2015 that AI boomed through technological advances already populating our lives. Once GPUs became easily available, parallel processing became quicker and more economical. It was also more powerful, owing to "practically infinite storage and a flood of data of every stripe (that whole Big Data movement) - images, text, transactions, mapping data, you name it."

Bringing AI to the Real World: Transition from Idea to Actual

One of the first real-life applications of artificial intelligence was computer chess. Compare this to the out-of-this-world vision presented at the 1956 Dartmouth conference: complex machines that were intelligent in exactly the sense human beings are. That is, machines possessing sensitivity to their environment (as we do, through our sense organs), rationality and thought as well. The folklore of AI is the large collection of movies, TV shows and novels that explore such machines. Thus, we have the big picture in place: we know what we are looking for.

What we are missing, however, are the small-scale technologies that can carry out the tasks we do (perhaps better than we do them). This is the motive of Narrow AI: it is geared towards producing technologies that fulfill specific tasks which, when put together, help us build towards the big picture. Consider image categorization on Pinterest or face recognition on Facebook.

I am the building block of AI!: With love, Machine Learning

Machine learning is a subset of AI focused on running data through algorithms to make predictions. Rob Schapire, a professor of theoretical machine learning at Princeton, puts it in simple terms: "Machine learning studies computer algorithms for learning to do stuff."

At the most basic level, machine learning is the process of parsing data through algorithms, learning from that data, and then being able to make predictions about the world. Thus, instead of completing a task with software that follows a hand-coded series of instructions, we have a machine trained to digest huge volumes of data and run it through algorithms, giving it the capacity to learn how to perform the task, as the sketch below illustrates.
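
To make this concrete, here is a minimal sketch of the learn-then-predict loop in Python. The synthetic data, the linear model and every name in it are illustrative assumptions, not anything from the article:

```python
# A minimal "learn from data, then predict" loop: fit a straight line to
# noisy points with ordinary least squares, then predict unseen inputs.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=100)              # inputs
y = 3.0 * X + 2.0 + rng.normal(0, 1, 100)     # noisy targets to learn from

# "Learning": estimate slope and intercept from the data alone,
# rather than hard-coding them as instructions.
A = np.column_stack([X, np.ones_like(X)])
(slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)

# "Prediction": apply what was learned to inputs never seen before.
X_new = np.array([11.0, 12.5])
print(slope * X_new + intercept)              # close to 3 * x + 2
```

The point is the shape of the workflow: no rule relating inputs to outputs was written by hand; it was estimated from the data.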

Even as they dipped into dreams of advanced AI, the pioneers of the idea also heavily discussed algorithmic ways to make it happen. These included inductive logic programming, Bayesian networks, decision tree learning, clustering and reinforcement learning, among many others. None of these succeeded in fulfilling the broader AI goal, and they also fell short of narrow AI objectives. (A small example of one such approach, decision tree learning, follows below.)
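
As a hedged illustration of one approach named above, decision tree learning, here is a sketch using scikit-learn's DecisionTreeClassifier on a stock toy dataset; the dataset and hyperparameters are arbitrary choices for the example:

```python
# Decision tree learning: induce a hierarchy of if/else rules from data.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X_train, y_train)                      # learn the split rules
print("held-out accuracy:", clf.score(X_test, y_test))
```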

Computer vision was the closest we came to applying machine learning early on. This involved producing a series of hand-coded classifiers that would allow the computer to identify an object, recognize where it started and ended, and read whatever signs were put up. However, apart from the huge amount of effort spent creating these classifiers as algorithms (so that the machine could make sense of the image and learn which sign it was), the machine would fail to detect the object if anything obstructed the view. Make no mistake, though: these algorithms were not all for nothing. A toy example of such a hand-coded classifier, and its brittleness, is sketched below.
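
To see why such hand-coded classifiers were brittle, consider this deliberately naive, entirely hypothetical sketch: a "stop sign detector" that does nothing more than threshold the fraction of red pixels. The colors, threshold and function name are all invented for the example:

```python
# A hand-coded classifier in miniature: a fixed rule, written by a human,
# with no learning involved -- and therefore easy to break.
import numpy as np

def looks_like_stop_sign(image_rgb, red_ratio_threshold=0.30):
    r, g, b = image_rgb[..., 0], image_rgb[..., 1], image_rgb[..., 2]
    red_pixels = (r > 150) & (g < 100) & (b < 100)   # crude "red" rule
    return red_pixels.mean() > red_ratio_threshold

sign = np.zeros((64, 64, 3), dtype=np.uint8)
sign[..., 0] = 200                         # a synthetic, mostly red "sign"
print(looks_like_stop_sign(sign))          # True
print(looks_like_stop_sign(sign // 2))     # False: the same sign in shadow
```

A learned classifier, by contrast, can be retrained on shadowed or occluded examples instead of needing a new hand-written rule for every failure mode.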

Gradually, we developed algorithms that genuinely enabled learning, allowing for the huge leaps of progress we witness today.

Making your job simpler, Machine Learning!: With love, Deep Learning

Neural networks were another early algorithmic approach to machine learning. Artificial neural networks were motivated by our understanding of the biology of the brain. In the brain, a neuron can link to any other neuron within a particular physical range; in an artificial neural network, by contrast, there are distinct layers, with defined connections and directions of data flow. Consider a task 'A' divided into a series of stages: each layer completes one stage and passes its output to the next, until the last layer produces the final output, as the sketch below illustrates.
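
Here is a minimal sketch of that layered structure, assuming a small fully connected network with random, untrained weights; the layer sizes and the ReLU activation are illustrative choices:

```python
# Data flows through distinct layers, each transforming the previous
# layer's output, until the last layer produces the final result.
import numpy as np

rng = np.random.default_rng(0)
layer_sizes = [4, 8, 8, 3]                     # input -> two hidden -> output
weights = [rng.normal(0, 0.5, (m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x):
    for W in weights[:-1]:
        x = np.maximum(0.0, x @ W)             # hidden layers: ReLU activation
    return x @ weights[-1]                     # final layer: raw class scores

print(forward(rng.normal(size=4)))             # one raw score per output class
```

Training would adjust these weights from data (by backpropagation); the sketch deliberately stops at the forward pass to show only the staged, layer-by-layer structure.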

One of the best examples is the detection of a stop sign. A stop sign is analyzed and a series of its characteristics are extracted: its shape, letters, size and motion, among others. The task of the artificial neural network is to identify whether a given sign is a stop sign or not. Using a probability vector, the network produces an output. This could take the form of the system being 89% sure it is a stop sign, 5% sure it is a speed-limit sign, 2% confident it is a pigeon on a branch, and so on (see the probability-vector sketch below). The biggest challenge with results of this kind was that they were purely computational: we would need to produce millions of algorithmic rules telling the machine what something is not before it could arrive at the answer. This did not deter Geoffrey Hinton and his team at the University of Toronto, who continued writing algorithms for supercomputers to prove the concept.
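
The probability vector mentioned above is conventionally obtained by applying a softmax to the network's final-layer scores. Here is a small sketch; the class labels and the raw scores are invented to echo the article's stop-sign example:

```python
# Softmax turns raw scores into confidences that are positive and sum to 1.
import numpy as np

def softmax(scores):
    shifted = scores - scores.max()            # subtract max for stability
    exps = np.exp(shifted)
    return exps / exps.sum()

labels = ["stop sign", "speed-limit sign", "pigeon on a branch"]
scores = np.array([4.0, 1.1, 0.2])             # hypothetical final-layer output
for label, p in zip(labels, softmax(scores)):
    print(f"{label}: {p:.0%}")                 # roughly 93%, 5%, 2%
```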

It was the arrival of GPUs that really changed the game. In 2012, Andrew Ng at Google made neural networks ever larger and more complex, allowing them to digest huge amounts of data (in his case, images from millions of YouTube videos). This was when deep learning came into its own: 'deep' referring to the many layers and their ever higher complexity.

With deep learning, machines have learned to recognize some images with greater accuracy than human beings: they can identify cats, indicators of cancer in blood samples, and tumors in MRI scans.

AI prophecies might come true! Courtesy: Deep Learning

Owing to the approach deep learning takes (greater complexity and greater accuracy), it becomes possible to move towards true AI. Because deep learning divides tasks into stages, it can serve as a model for practical assistants: movie recommendations, driverless cars, preventive healthcare.

AI has been in our stories for centuries. While the idea has surrounded us for a long time, we faced the challenge of finding ways to actualize it. Today, we have the technologies to walk towards the happy ending of those stories.

We had dreams. Now, we have their reality.

Share your comments below. Contact us to know more about Suyati's digital transformation solutions.