Tutorials on Natural Language Processing, Machine Learning, Data Extraction, and more
Since many of you have asked for coupons for our courses, here is the list of coupons to get any of our Python courses at a discounted price.
Before we start, here is a gift for you: join the first course for free.
Facebook is the biggest social network of our time, and it contains a lot of valuable data that can be useful in many cases. Imagine being able to extract this data and use it as your project’s dataset.
In this tutorial, you are going to use Python to extract data from any Facebook profile or page. The data that you will be extracting from a predefined number of posts is:
You will also be extracting comments from the posts, and from each comment:
Continue reading “Extracting Facebook Posts & Comments with BeautifulSoup & Requests”
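As a rough idea of the approach, here is a minimal sketch with Requests and BeautifulSoup. The URL and the selector are placeholders, and real Facebook pages usually require a logged-in session, so treat this only as the general pattern the tutorial builds on.

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URL; real Facebook markup changes often and usually needs cookies/session.
url = "https://www.facebook.com/somepage/posts/"
headers = {"User-Agent": "Mozilla/5.0"}  # plain requests are often blocked without one

response = requests.get(url, headers=headers)
soup = BeautifulSoup(response.text, "html.parser")

# Collect the text of whatever elements hold post bodies on the fetched page.
# Replace "div" with the selector that matches the markup you actually receive.
for post in soup.find_all("div"):
    text = post.get_text(strip=True)
    if text:
        print(text[:100])
```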
News plays an essential role in our daily life. Whether you want to create your own news website or carry out a data analysis project, there is often a need to fetch different types of news articles or headlines, either to aggregate the news from different sources in one place or to analyze them. The applications are many, and fortunately there is a way to retrieve news articles from the web, from different sources, at the same time.
In this tutorial, you will learn how to extract news headlines and articles using the News API and save them to a CSV file.
Continue reading “News API: Extracting News Headlines and Articles”
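For a sense of what that looks like, here is a minimal sketch that calls the News API top-headlines endpoint with Requests and writes a few fields to a CSV file; the API key is a placeholder for your own.

```python
import csv
import requests

NEWS_API_KEY = "your-api-key"  # placeholder: get one from newsapi.org
url = "https://newsapi.org/v2/top-headlines"
params = {"country": "us", "pageSize": 20, "apiKey": NEWS_API_KEY}

# The response is JSON; the "articles" key holds the list of headlines.
articles = requests.get(url, params=params).json().get("articles", [])

with open("headlines.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["source", "title", "publishedAt", "url"])
    for a in articles:
        writer.writerow([a["source"]["name"], a["title"], a["publishedAt"], a["url"]])
```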
Spreadsheets are among the most popular office utilities in the world. Almost all professions use spreadsheets for a wide range of reasons, from tallying numbers and displaying them in graphs to doing unit conversions, to mention a few.
Google Sheets is one of the more popular spreadsheet applications available today. Backed by the Google platform, it has some nifty features that make it stand out from its competitors.
In this tutorial, you will learn how to use the power of Google Sheets API and Python to build a simple language translator.
Continue reading “Create a Translator Using Google Sheets API & Python”
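As a rough outline, here is a minimal sketch using the gspread client: it writes a GOOGLETRANSLATE formula into a sheet and reads the translated value back. The spreadsheet name and the service-account key file are assumptions made for illustration.

```python
import gspread

# Assumes a Google service-account JSON key and a spreadsheet named
# "Translator" shared with that service account.
gc = gspread.service_account(filename="service_account.json")
worksheet = gc.open("Translator").sheet1

def translate(text, source="en", target="es"):
    # GOOGLETRANSLATE is evaluated by Google Sheets itself, not by Python.
    worksheet.update_acell("A1", text)
    worksheet.update_acell("B1", f'=GOOGLETRANSLATE(A1, "{source}", "{target}")')
    return worksheet.acell("B1").value

print(translate("Good morning"))  # e.g. "Buenos días"
```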
Chatbots are intelligent agents that engage in a conversation with humans in order to answer user queries on a certain topic. Amazon’s Alexa, Apple’s Siri, and Microsoft’s Cortana are some examples of chatbots.
Depending on their functionality, chatbots can be divided into three categories: general-purpose chatbots, task-oriented chatbots, and hybrid chatbots. General-purpose chatbots conduct a general discussion with the user (not on any specific topic). Task-oriented chatbots, on the other hand, are designed to perform specialized tasks, for example, to serve as an online ticket reservation or pizza delivery system. Finally, hybrid chatbots are designed for both general and task-oriented discussions.
Sentiment Analysis is a special case of text classification where users’ opinions or sentiments regarding a product are classified into predefined categories such as positive, negative, and neutral. Public sentiment can then be used for corporate decision making regarding a product that is being liked or disliked by the public.
Both rule-based and statistical techniques have been developed for sentiment analysis. With the advancements in machine learning and natural language processing techniques, Sentiment Analysis techniques have improved considerably.
In this tutorial, you will see how Sentiment Analysis can be performed on live Twitter data. The tutorial is divided into two major sections: Scraping Tweets from Twitter and Performing Sentiment Analysis.
Continue reading “Scraping Tweets and Performing Sentiment Analysis”
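As a tiny illustration of the second part, the sketch below scores a few sample texts with TextBlob’s polarity value; the tutorial’s own scraping and analysis pipeline is not reproduced here, so take TextBlob as a stand-in classifier.

```python
from textblob import TextBlob

# TextBlob's polarity score (-1.0 to 1.0) is used to bucket each text
# into positive / negative / neutral. The example texts are made up.
tweets = [
    "I love how fast the new update is!",
    "This service has been down all day, terrible.",
    "The event starts at 9 am.",
]

for text in tweets:
    polarity = TextBlob(text).sentiment.polarity
    label = "positive" if polarity > 0 else "negative" if polarity < 0 else "neutral"
    print(f"{label:>8}  {polarity:+.2f}  {text}")
```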
Text Classification is the process of classifying data in the form of text, such as tweets, reviews, articles, and blogs, into predefined categories. Sentiment analysis is a special case of text classification where users’ opinions or sentiments about a product are predicted from textual data.
In this tutorial, you will learn how to develop a Sentiment Analysis model that uses the TF-IDF feature generation approach and is capable of predicting user sentiment (i.e., the view or opinion that is held or expressed) about six airlines operating in the United States by analyzing user tweets. You will use Python’s Scikit-Learn library for machine learning to implement the TF-IDF approach and to train the prediction model.
Continue reading “Twitter Sentiment Analysis Using TF-IDF Approach”
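The core of the TF-IDF approach can be sketched in a few lines of Scikit-Learn; the tiny training set below is made up, whereas the tutorial works with a real airline-tweets dataset.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data standing in for the airline-tweets dataset.
train_texts = [
    "the flight was delayed for three hours",
    "great crew and a smooth landing",
    "lost my luggage again, never flying with them",
    "check-in was quick and the staff were friendly",
]
train_labels = ["negative", "positive", "negative", "positive"]

# TF-IDF turns each text into a weighted term vector; a simple classifier
# is then trained on those vectors.
model = make_pipeline(TfidfVectorizer(stop_words="english"), LogisticRegression())
model.fit(train_texts, train_labels)

print(model.predict(["the seats were comfortable and boarding was easy"]))
```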
REST technology is generally preferred to the more robust Simple Object Access Protocol (SOAP) technology because REST uses less bandwidth, making it more suitable for internet usage.
REST APIs are all around us these days. Almost every major service provider on the internet provides some kind of REST API, and there are many REST clients available that can be used to interact with these APIs and test requests before writing your code. Postman is one of the world’s leading API Development Environments (ADE), with many features baked in.
In this tutorial, you are going to learn how to use Postman to make API calls with and without authorization.
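For comparison, the two kinds of calls you would build in Postman look roughly like this when sent from Python with the requests library; the URL and token are placeholders, not a real service.

```python
import requests

BASE_URL = "https://api.example.com/v1/items"  # placeholder endpoint
TOKEN = "your-access-token"                    # placeholder token

# Without authorization — a protected endpoint typically returns 401.
r = requests.get(BASE_URL)
print(r.status_code)

# With authorization — the token is sent in the Authorization header.
r = requests.get(BASE_URL, headers={"Authorization": f"Bearer {TOKEN}"})
print(r.status_code, r.json() if r.ok else r.text)
```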
Twitter has been a good source for data mining. Many data scientists and analytics companies collect tweets and analyze them to understand people’s opinions about certain matters.
In this tutorial, you will learn how to use the Twitter API and the Python Tweepy library to search for a word or phrase, extract the tweets that include it, and print the results.
Note: This tutorial is different from our other Twitter API tutorial in that this one uses the Twitter Streaming API, which fetches live tweets, while the other tutorial uses the cursor method to search existing tweets. With the cursor you can specify the language and tweet limit, and you can also filter out retweets.
Continue reading “Twitter API: Extracting Tweets with Specific Phrase”
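As a rough sketch of the streaming approach, assuming Tweepy 3.x’s StreamListener interface and placeholder credentials from your Twitter developer account:

```python
import tweepy

# Placeholder credentials from a Twitter developer account.
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")

class PhraseListener(tweepy.StreamListener):
    def on_status(self, status):
        # Called for every live tweet that matches the tracked phrase.
        print(status.user.screen_name, ":", status.text)

    def on_error(self, status_code):
        if status_code == 420:  # rate limited
            return False        # returning False disconnects the stream

stream = tweepy.Stream(auth=auth, listener=PhraseListener())
stream.filter(track=["machine learning"])  # the phrase to watch for
```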
GitHub is a web-based hosting service for version control using Git. It is mostly used for storing and sharing computer source code. It offers all of the distributed version control and source code management functionality of Git while adding its own features.
GitHub stores more than 3 million repositories, with more than 1.7 million developers using it daily. With so much data, it can be quite daunting at first to find the information one needs or to perform repetitive tasks, and that is when the GitHub API comes in handy.
In this tutorial, you are going to learn how to use the GitHub API to search for repositories and files that match particular keyword(s) and retrieve their URLs using Python. You will also learn how to download files or a specific folder from a GitHub repository.
Continue reading “Searching GitHub Using Python & GitHub API”
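As a taste of the search endpoint, here is a minimal sketch that queries GitHub’s repository search with Requests; an unauthenticated call works for a handful of requests, and a personal access token raises the rate limit.

```python
import requests

keyword = "web scraping"

# GitHub's public repository search endpoint; results come back as JSON.
resp = requests.get(
    "https://api.github.com/search/repositories",
    params={"q": keyword, "sort": "stars", "per_page": 5},
)

for repo in resp.json().get("items", []):
    print(repo["full_name"], "-", repo["html_url"])
```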