In this episode of 'Network Nex,' we will explore protocols and delve into the reasons behind the necessity of the OSI model. So let's start 🚀🚀
Protocols refer to a set of rules that define how data is transmitted and received over a network. They ensure a standardized communication process between devices to facilitate successful data exchange.
When the sender machine transmits data, the data passes through a series of protocols (that is, it follows a set of rules) before leaving the sender machine. Upon reaching the receiver machine, the data passes through the protocols at the receiver's end before it is delivered to the user.
Protocols provide both mandatory and optional functionalities to ensure standardized and efficient communication.
Mandatory functionalities -
Error Control - It involves the detection and, in some cases, correction of errors that may occur during the transmission of data.
👉When user1 sends a message "m", the receiver should ideally receive and interpret the same message "m". However, due to various factors such as noise, interference, or intentional hacking, the message may get corrupted or altered during transmission.
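To make this concrete, here is a minimal Python sketch of error detection using a CRC-32 checksum. This is only an illustration of the idea - real protocols define their own error-detection (and sometimes correction) schemes.

```python
import zlib

def send(message: bytes):
    # Sender attaches a CRC-32 checksum computed over the payload.
    return message, zlib.crc32(message)

def receive(message: bytes, checksum: int) -> bool:
    # Receiver recomputes the checksum; a mismatch means the data was altered.
    return zlib.crc32(message) == checksum

payload, crc = send(b"m")
print(receive(payload, crc))   # True  - the message arrived intact
print(receive(b"x", crc))      # False - corruption (or tampering) detected
```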
Flow Control - Imagine you have a friend sending you text messages really quickly, one after another. Now, think of your phone as having a limited space to store these messages before you can read them. If your friend keeps sending messages faster than you can read and process them, your phone's storage (or buffer) will fill up, and you might miss or lose some messages.
👉In the world of computer networks, a similar thing can happen. The sender is like your friend, and the network or the receiving device has a limited capacity to handle incoming messages. If the sender keeps bombarding the network with messages without considering how fast the receiver can process them, it can lead to congestion and potential loss of data.
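Here is a toy Python sketch of the idea - a receiver with a small buffer that the sender must not overrun. Real protocols use mechanisms such as sliding windows; this only shows the intuition.

```python
from collections import deque

class Receiver:
    def __init__(self, buffer_size: int):
        self.buffer = deque()          # limited space to hold unread messages
        self.buffer_size = buffer_size

    def can_accept(self) -> bool:
        # The receiver tells the sender whether it has room left.
        return len(self.buffer) < self.buffer_size

    def accept(self, msg: str) -> None:
        self.buffer.append(msg)

    def process_one(self) -> None:
        # Reading a message frees up buffer space.
        if self.buffer:
            print("processed:", self.buffer.popleft())

receiver = Receiver(buffer_size=2)
for msg in ["m1", "m2", "m3", "m4"]:
    while not receiver.can_accept():
        receiver.process_one()         # the sender waits until the receiver catches up
    receiver.accept(msg)
```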
Multiplexing / Demultiplexing - If P1, P2, and P3 are processes on your machine, and each is responsible for different tasks (e.g., web browsing, file transfer, and email), the multiplexing process would combine the data generated by these processes into a single stream. On the receiving end, demultiplexing would ensure that the data related to web browsing goes to the web browser process, file transfer data goes to the file transfer process, and so on.
We will discuss this topic later in this series.
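For now, here is a toy Python sketch of the idea, using made-up port numbers to tag which process each piece of data belongs to:

```python
# Hypothetical port numbers assigned to the three processes.
PORTS = {"web_browser": 50001, "file_transfer": 50002, "email": 50003}

def multiplex(chunks):
    # Sender side: tag each chunk with its process's port and merge
    # everything into a single stream.
    return [(PORTS[process], data) for process, data in chunks]

def demultiplex(stream):
    # Receiver side: use the port to route each chunk back to the right process.
    routed = {port: [] for port in PORTS.values()}
    for port, data in stream:
        routed[port].append(data)
    return routed

stream = multiplex([("web_browser", "GET /index.html"),
                    ("email", "MAIL FROM:<user@example.com>"),
                    ("file_transfer", "chunk-1")])
print(demultiplex(stream))
```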
Optional functionalities -
Encryption / Decryption - Encryption and decryption are cryptographic processes used to secure data during transmission and storage. They are highly recommended for ensuring the confidentiality and integrity of sensitive information, but they are not required for every exchange.
👉Whether to use encryption depends on what you're sharing. For regular stuff, it's like talking openly, but for secrets or important info, it's like using a secret code to keep things safe. That's why encryption is an optional functionality - you decide when it's needed based on how important or private your information is.
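As a small illustration, here is how encryption and decryption might look in Python using the third-party cryptography package - just one possible tool, since the choice of cipher is entirely up to you:

```python
from cryptography.fernet import Fernet   # pip install cryptography

key = Fernet.generate_key()   # a shared secret both sides must know
cipher = Fernet(key)

token = cipher.encrypt(b"my password is hunter2")   # sender encrypts
print(token)                   # unreadable gibberish while "on the wire"
print(cipher.decrypt(token))   # receiver recovers the original message
```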
Checkpoint✅ - Think of a checkpoint like a bookmark in a book 📔. When you reach a checkpoint, you're marking a specific point in your progress. In the world of computers and data, a checkpoint is a similar idea – it's a specific moment in time that's marked, usually for reference or backup.
👉Imagine you're writing a story on your computer. Every now and then, you decide to create a checkpoint. This is like saving a special version of your story at a particular point. Why?
👉Let's say you're happily typing away, and suddenly the computer crashes or you accidentally delete a big chunk of your story. Uh-oh! But if you have a checkpoint, you can go back to that saved version where everything was fine, and you don't lose too much work.
👉So, a checkpoint in terms of data is a way to save a specific point in your work, just like saving different versions of your story. It's like a safety net, helping you recover if something unexpected happens.
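Here is a tiny Python sketch of the checkpoint idea - saving progress to a file (the file name is made up) so it can be restored after a crash:

```python
import json
import os

CHECKPOINT_FILE = "story_checkpoint.json"   # hypothetical checkpoint file

def save_checkpoint(state: dict) -> None:
    # Mark a known-good point by writing the current state to disk.
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump(state, f)

def restore_checkpoint() -> dict:
    # After a crash, resume from the last saved point instead of starting over.
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE) as f:
            return json.load(f)
    return {"paragraphs_written": 0}

save_checkpoint({"paragraphs_written": 12})
print(restore_checkpoint())   # {'paragraphs_written': 12}
```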
⭐Mandatory functionalities and their algorithms are already operational; their code lives in the kernel of our operating system. If we also included the code for optional functionalities in the operating system, the system's complexity would increase, which is not beneficial for its performance. Optional functionalities are therefore left out by default; some systems may require these features, while others may not.
⭐While I've highlighted only a few functionalities, it's important to note that a protocol encompasses over 70 functionalities. These functionalities, both mandatory and optional, contribute to the protocol's comprehensive ability to manage communication effectively.
Why do we need the OSI model?
⭐Because it consolidates these 70-plus functionalities, both mandatory and optional, into a standardized framework.
⭐All these functionalities are organized into the OSI model's layers. When a sender or client machine sends a message, the data passes through these layers sequentially. Similarly, at the receiver end, the data passes back up through the layers in reverse order.
⭐In this series, we will cover two networking models: the OSI model and the TCP/IP model.
Introduction to the OSI model 🚀
The Open Systems Interconnection (OSI) model provides a standardized framework made up of seven layers.
Each layer serves a specific purpose and interacts with adjacent layers, providing a structured approach to designing and understanding network communication. The seven layers of the OSI model, from the bottom to the top, are Physical, Data Link, Network, Transport, Session, Presentation, and Application. Each layer has its own set of protocols and functionalities, contributing to the efficiency and interoperability of network systems.
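To visualize how data moves down the sender's stack and back up the receiver's stack, here is a toy Python sketch in which each layer simply wraps the data with its own label. Real layers add structured headers (and the Physical layer deals in signals, not text); this only shows the ordering.

```python
# The seven OSI layers, from top (closest to the user) to bottom (closest to the wire).
LAYERS = ["Application", "Presentation", "Session",
          "Transport", "Network", "Data Link", "Physical"]

def send(data: str) -> str:
    # Sender side: walk down the stack, each layer wrapping the data it receives.
    for layer in LAYERS:
        data = f"[{layer}]{data}"
    return data

def receive(frame: str) -> str:
    # Receiver side: walk back up the stack, each layer unwrapping its own label.
    for layer in reversed(LAYERS):
        frame = frame.removeprefix(f"[{layer}]")
    return frame

wire = send("hello")
print(wire)            # [Physical][Data Link]...[Application]hello
print(receive(wire))   # hello
```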
In the upcoming episodes, each layer of the OSI model will be explored in great detail 🤩
If you like my work, you can buy me a coffee and share your thoughts https://www.buymeacoffee.com/yashika227x