Artificial intelligence, machine learning, deep learning: all are increasingly popular concepts. They sound like something from the future, but they are being used more and more in all areas of life, not only on the Internet or in computer vision. They are diagnosing illnesses, solving optimization problems, driving cars, and a host of other things.
What are we going to talk about?
We will not publish news. We will try to gather useful information, as usual on Ikkaro: collecting tools, explaining concepts, working through Machine Learning examples, covering applications in different fields such as IoT, and sharing any interesting dataset we find.
I am not an expert. I am still learning, but I think I can share the knowledge I acquire and improve along the way.
The idea of the project is to give voice instructions to interact with our PC or our Raspberry Pi, using OpenAI's Whisper speech-to-text model.
We will speak a command, which Whisper transcribes to text; the text is then analyzed to execute the appropriate action, which can range from launching a program to applying voltage to the Raspberry Pi's GPIO pins.
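The "analyze the text and execute" step can be as simple as matching key phrases against a command table. Here is a minimal sketch: the `COMMANDS` mapping and the function names are my own hypothetical examples, not part of any library. The Whisper call itself is shown only as a comment, since it requires the `openai-whisper` package and an audio file.

```python
import subprocess

# Hypothetical mapping from spoken phrases to actions; adapt to your setup.
COMMANDS = {
    "open browser": ["firefox"],
    "take photo": ["raspistill", "-o", "photo.jpg"],
}

def match_command(transcript):
    """Return the action whose key phrase appears in the transcript, or None."""
    text = transcript.lower().strip()
    for phrase, action in COMMANDS.items():
        if phrase in text:
            return action
    return None

def run_transcript(transcript):
    """Execute the matched program, if any. Returns True on a match."""
    action = match_command(transcript)
    if action is None:
        return False
    subprocess.run(action)
    return True

# The transcript itself would come from Whisper, e.g.:
#   import whisper
#   model = whisper.load_model("tiny")
#   transcript = model.transcribe("command.wav")["text"]
```

On a Raspberry Pi 2 the smallest Whisper model ("tiny") is the realistic choice, and even then transcription will be slow, so short commands work best.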
I am going to use an old Raspberry Pi 2, a USB microphone, and the speech-to-text model recently released by OpenAI, Whisper. At the end of the article you can learn a bit more about Whisper.
I just finished Google's Machine Learning Crash Course, an introductory course that covers the basic concepts and shows examples of real implementations with TensorFlow. Those examples are what encouraged me to try this.
Collaboratory, also called Google Colab, is a product of Google Research used to write and run Python (and other languages) from our browser.
I leave you a beginner's guide that complements this article perfectly.
Colab is a hosted Jupyter, already installed and configured, so we don't have to set up anything on our computer; we simply work from the browser, on resources in the cloud.
It works just like Jupyter (you can see our article on it). Notebooks are made up of cells that can contain text, images, or code, in this case Python, because unlike Jupyter, Colab currently only offers a Python kernel. They talk about later adding others such as R or Scala, but no date has been announced.
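To give an idea of how cells work, here is a typical first code cell you might run in Colab or Jupyter. Nothing here is Colab-specific; it is plain Python, which is exactly the point of the notebook model.

```python
# A typical first cell: run Python directly in the browser and inspect the runtime.
import platform

print("Python", platform.python_version())

# In a notebook, the value of the last expression in a cell is
# displayed automatically below it, without needing print():
sum(range(10))  # the notebook shows 45
```

Text cells, in contrast, are written in Markdown and rendered in place, which is what makes notebooks convenient for mixing explanations with runnable code.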
Looking at the historical data offered by a meteorological observatory in my city, I see that they only offer it graphically and as PDF downloads. I don't understand why they don't let you download it as CSV, which would be much more useful for everyone.
So I've been looking for a solution to convert those tables from PDF to CSV or, if someone prefers, to Excel or LibreOffice format. I like CSV because you can process it with Python and its libraries, or easily import it into any spreadsheet.
Since the idea is to automate the process, what I want is a Python script, and this is where Tabula comes in.
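From Python, Tabula is used through the `tabula-py` package (which wraps the Java Tabula tool, so Java must be installed). Its `convert_into` function writes every table it detects straight to CSV. The helper names below are my own; only the `tabula.convert_into` call is the library's API.

```python
from pathlib import Path

def csv_name(pdf_path):
    """Derive the output CSV filename from the input PDF path."""
    return str(Path(pdf_path).with_suffix(".csv"))

def pdf_tables_to_csv(pdf_path):
    """Extract all tables from a PDF into a single CSV file."""
    import tabula  # pip install tabula-py (requires Java)
    out = csv_name(pdf_path)
    # Detect and dump every table on every page directly to CSV.
    tabula.convert_into(pdf_path, out, output_format="csv", pages="all")
    return out
```

With the observatory's monthly PDFs, this becomes a one-line call per file, and the resulting CSVs can then be loaded with Python's `csv` module or any spreadsheet.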
In this article I leave an Anaconda installation guide and show how to use its Conda package manager. With it we can create development environments for Python and R with the libraries we want. Very useful for getting started with Machine Learning, data analysis, and Python programming.
Anaconda is a free and open source distribution of the Python and R programming languages, widely used in scientific computing (Data Science, Machine Learning, science, engineering, predictive analytics, Big Data, etc.).
It installs a large number of applications widely used in these disciplines all at once, instead of having to install them one by one: more than 1,400 packages, among the most used in the field. Some examples are NumPy, SciPy, pandas, Jupyter, and scikit-learn.
After finishing the Machine Learning course, I was looking for where to continue. The Octave/Matlab prototyping environment used in the course is not what people use in practice, so you have to jump to something of higher quality. Among the candidates recommended to me the most is Keras, with TensorFlow as the backend. I'm not going to argue whether Keras is better than other tools or frameworks, or whether to choose TensorFlow or Theano; I'm just going to explain how to install it on Ubuntu.
At first I tried to install it following the official documentation, and it was impossible: I always hit an error or some unresolved issue. In the end I looked for specific tutorials on how to install Keras on Ubuntu, and even so I spent two nights on it. I finally got it working, and I leave you how I did it in case it paves the way for you.
Since we are going to follow the steps recommended by the websites I list as sources at the end of the tutorial, we first install pip, which I didn't have, to manage packages. pip is exactly that: a package management system for software written in Python.
It is a free course on Machine Learning taught by Andrew Ng. Once finished, if you want, you can get a certificate that endorses the skills achieved for €68. It is built on three pillars: videos, quizzes, and programming exercises. It is in English, with subtitles in several languages, but the Spanish subtitles are not very good and are sometimes out of sync; it's much better to use the English ones.
It is quite theoretical, but maybe that's why it seems like a good way to start: you will learn not only what to do, but why you do it.