Business Intelligence, Data Analytics, Data Visualization, Machine Learning, Natural Language Processing, AI https://www.linkedin.com/in/keremkargin/

In this blog post, I will first try to explain the basics of Ridge Regression. Then, we’ll build the model using a dataset with Python. Finally, we’ll evaluate the model by calculating the mean square error. Let’s get started step by step.

The main purpose of Ridge Regression is to find the coefficients that minimize the sum of squared errors while applying a penalty to those coefficients. Ridge Regression is also known as L2 regularization. In another source, it is defined as follows:

Ridge regression is a model tuning method that is used to analyse any data that suffers from multicollinearity. This…
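The post goes on to build the model in Python. As a minimal sketch of the idea behind it, the closed-form ridge solution (least squares plus an L2 penalty on the coefficients) can be written in a few lines of NumPy; the toy data and penalty value below are made up for illustration only:

```python
import numpy as np

def ridge_fit(X, y, alpha):
    """Closed-form ridge solution: beta = (X^T X + alpha*I)^-1 X^T y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

# toy data: y = 3*x1 + 2*x2, noise-free, so the true coefficients are known
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = X @ np.array([3.0, 2.0])

beta_ols = ridge_fit(X, y, alpha=0.0)    # alpha=0 is plain least squares
beta_ridge = ridge_fit(X, y, alpha=10.0) # a positive penalty shrinks the coefficients

mse = np.mean((y - X @ beta_ridge) ** 2)
```

Note how the penalty trades a little training error (a nonzero MSE here) for smaller coefficients, which is exactly what helps under multicollinearity.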

In this blog post, first, I’ll try to explain the basics of Multiple Linear Regression. Then, I’ll build the model using a dataset with Python. Finally, I’ll evaluate the model by calculating the mean square error. Let’s get started step by step.

The main purpose of Multiple Linear Regression is to find the linear function expressing the relationship between the dependent and independent variables. A Multiple Linear Regression model has one dependent variable and more than one independent variable. In another source, it is defined as follows:

Multiple linear regression is used to estimate the relationship between two or more independent variables and…
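The post then fits such a model in Python. As a small self-contained sketch of the same idea, multiple linear regression with an intercept can be fit by ordinary least squares using NumPy; the data below is invented so the true coefficients are known:

```python
import numpy as np

# toy data generated from y = 1 + 2*x1 - 3*x2 (two independent variables, no noise)
X = np.array([[1.0, 2.0], [2.0, 0.5], [3.0, 1.0], [4.0, 3.0], [0.0, 1.5]])
y = 1 + 2 * X[:, 0] - 3 * X[:, 1]

# prepend a column of ones for the intercept, then solve least squares
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # [intercept, coef_x1, coef_x2]

y_pred = A @ coef
mse = np.mean((y - y_pred) ** 2)
```

Because the toy data is noise-free, the fitted coefficients recover the generating ones and the mean squared error is essentially zero.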

In this blog post, I will first try to explain the basics of Simple Linear Regression. Then, we’ll build the model using a dataset with Python. Finally, we’ll evaluate the model by calculating the mean square error. Let’s get started step by step.

Simple Linear Regression is a statistical method that helps us describe and analyze the relationship between two variables, one dependent and one independent. In another source, it is defined as follows:

Simple linear regression is used to estimate the relationship between two quantitative variables. You can use simple linear regression when you want to know: - How…
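The post builds this model in Python as well. The underlying math fits in a few lines: the least-squares slope is the covariance of x and y divided by the variance of x, and the intercept follows from the means. A plain-Python sketch on made-up data:

```python
# invented sample data, roughly following y = 2x
xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.1, 5.9, 8.1, 9.8]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# least-squares formulas: slope = cov(x, y) / var(x), intercept from the means
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

# mean squared error of the fitted line
mse = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys)) / n
```

On this data the fitted line is close to y = 2x, and the MSE quantifies how far the points scatter around it.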

In this blog post, I will try to explain what Apache Kafka is, how it works, when to use it, how to write data to Kafka, and how to read data from Kafka. Happy reading.

Let’s first take a look at how architecture used to be. In the ’90s, the CLIENT-SERVER architecture was popular, built around a **Monolith** structure. In a **Monolith** structure, all components are contained in a single application. Over time, access to the internet increased and version updates became a constant need. For this reason, the number of people who prefer this structure decreased over time.

Then **Microservices** came to…
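Before going further, Kafka's core idea is worth sketching: each topic partition is an append-only log that producers append to, while every consumer tracks its own read offset. The toy in-memory model below is an illustration only, not real Kafka; an actual producer or consumer would talk to a broker over the network through a client library such as kafka-python:

```python
class PartitionLog:
    """Toy stand-in for one Kafka topic partition: an append-only log."""

    def __init__(self):
        self.messages = []

    def append(self, message):
        # the "producer" side: records are only ever appended
        self.messages.append(message)
        return len(self.messages) - 1  # offset of the new record

class Consumer:
    """Toy consumer: remembers its own position (offset) in the log."""

    def __init__(self, log):
        self.log = log
        self.offset = 0

    def poll(self):
        # the "consumer" side: read everything past our offset, then advance it
        batch = self.log.messages[self.offset:]
        self.offset = len(self.log.messages)
        return batch

log = PartitionLog()
log.append("order created")
log.append("order shipped")

c = Consumer(log)
first = c.poll()   # returns both messages
second = c.poll()  # nothing new has arrived, so this is empty
```

Because each consumer owns its offset, many independent consumers can read the same log at their own pace, which is the property that makes Kafka useful for decoupled microservices.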

In this blog post, I’ll talk about Tokenization, Stemming, Lemmatization, and Part-of-Speech Tagging, which are frequently used in Natural Language Processing. We’ll see how to use them, reinforcing each with an application. Happy reading.

`Tokenization` is the process of breaking down a given text into the smallest units in a sentence, called tokens. Punctuation marks, words, and numbers can all be considered tokens. So why do we need `Tokenization`? We may want to find the frequencies of the words in the entire text by dividing the given text into tokens. Then…
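The word-frequency idea above can be sketched with the standard library alone. The tiny regex tokenizer below is only an illustration; real NLP work would typically use a library tokenizer such as NLTK's `word_tokenize`, and the sample text is made up:

```python
import re
from collections import Counter

text = ("Natural language processing breaks text into tokens. "
        "Tokens can be words, numbers, or punctuation.")

# split into word tokens and single punctuation-mark tokens
tokens = re.findall(r"\w+|[^\w\s]", text.lower())

# count only the word/number tokens to get word frequencies
freqs = Counter(t for t in tokens if t.isalnum())
```

Lowercasing before counting makes "Tokens" and "tokens" fall into the same bucket, which is usually what you want when computing frequencies over a whole text.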

In this post, I will try to share the basics of Natural Language Processing and its uses, as well as the relationship between Text Mining and Natural Language Processing. I intend to cover text preprocessing in natural language processing in a separate article. Happy reading.

Before I define Natural Language Processing myself, let’s look at how it is defined in several sources.

Natural Language Processing, or NLP for short, is broadly defined as the automatic manipulation of natural language, like speech and text, by software.

Another source defines it as follows:

Natural language processing is a form of artificial intelligence (AI)…

There is a lot of talk about data today, and many things are said about it. However, every dataset hides a story within itself. The main thing is to decipher the story hidden behind the data and to tell that story to different audiences. I see discovering the story of the data as one task, and drawing meaningful conclusions from that story and communicating them to people as quite another. …