Introduction to Artificial Intelligence:
Nowadays, artificial intelligence is used all over the world. It can save us a great deal of time, so learning about artificial intelligence is valuable for all of us.
Artificial intelligence is a current trend that draws on many disciplines, including statistics, data analytics, neuroscience, linguistics, hardware and software engineering, philosophy, and psychology, and it now powers everyday activities such as writing blogs and creating YouTube videos.
Business leaders are learning artificial intelligence to take their businesses to the next level.
What is Artificial Intelligence?
Artificial intelligence is technology that enables machines to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, problem solving, and decision-making.
In other words, artificial intelligence refers to the recreation of human intelligence in machines.
History of Artificial Intelligence:
1950 – Alan Turing published a landmark paper, “Computing Machinery and Intelligence,” in which he speculated about the possibility of creating machines that think.
1951 – Game AI – Christopher Strachey wrote a checkers program, and Dietrich Prinz wrote one for chess.
1956 – The birth of AI – John McCarthy coined the term “artificial intelligence” in 1956 at the Dartmouth Workshop.
1959 – First AI Laboratory – Research on AI began at the MIT AI Lab.
1961 – First industrial robot – Unimate, the first industrial robot, was introduced to a General Motors assembly line.
1966 – First chatbot – ELIZA, the first AI chatbot, was introduced by Joseph Weizenbaum at MIT.
1980 – Expert systems – A form of AI program called the “expert system” was adopted by corporations around the world, and knowledge became the focus of mainstream AI research.
An expert system is a program that answers questions or solves problems within a specific domain, using logical rules derived from the knowledge of human experts (a minimal sketch appears after this timeline).
1997 – IBM Deep Blue – IBM’s Deep Blue defeated world champion Garry Kasparov at chess.
2007 – Image processing systems – A group at the University of Massachusetts Amherst released Labeled Faces in the Wild, an annotated set of face images that became a standard body of training data and a benchmark for the next generation of face recognition and image processing systems.
2009 – Big data – The McKinsey Global Institute reported that by 2009, nearly every sector of the US economy had, on average, more than 200 terabytes of stored data per company. This scale of information became known as big data.
2017 – Transformers – The transformer architecture was introduced in 2017, sparking the AI boom and enabling the development of large language models that produce human-like text.
2020 to present – The LLM era – The release of large language models (LLMs) such as ChatGPT marked the beginning of a new era of AI.
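To make the expert system idea from the 1980 entry concrete, here is a minimal sketch of a rule-based system in Python. The toy rules, the facts, and the diagnose() function are illustrative assumptions for this post, not part of any historical system; real expert systems of the 1980s chained together hundreds or thousands of such rules with full inference engines.

# Minimal sketch of a rule-based expert system (illustrative toy example).
# Each rule maps a set of required facts to a conclusion, mimicking the
# "IF conditions THEN conclusion" rules captured from human experts.

# Hypothetical rules for a toy hardware troubleshooting domain.
RULES = [
    ({"no_power", "plugged_in"}, "check the fuse"),
    ({"no_power"}, "check that the device is plugged in"),
    ({"powers_on", "no_display"}, "check the monitor cable"),
]

def diagnose(facts):
    """Fire every rule whose conditions are all present in the known facts."""
    conclusions = []
    for conditions, conclusion in RULES:
        if conditions <= facts:  # set <= set: all conditions are satisfied
            conclusions.append(conclusion)
    return conclusions

# Example consultation: the user reports two observed facts.
observed = {"no_power", "plugged_in"}
for advice in diagnose(observed):
    print("Expert system suggests:", advice)

Running this prints both matching suggestions. Notice that the program’s “expertise” lives entirely in the rule list, which is exactly why knowledge itself became the focus of AI research in that period.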