
When building Node.js applications, perhaps the most important concept to understand, and one that is often misunderstood, is the concurrency model for handling I/O operations and the use of callback functions. Node.js is an event-driven, non-blocking, single-threaded JavaScript runtime built on Chrome’s V8 engine, which allows it to execute JavaScript code on the server side. In simple terms, everything that happens in Node.js is a reaction to an event.
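As a rough sketch of that callback style (the `./config.json` path is just a placeholder for illustration), a read is handed off to the runtime, the rest of the program keeps running, and the callback fires later once the result is ready:

```js
// A minimal sketch of the callback style: the read is handed off to the runtime
// and the callback fires later, once the data (or an error) is ready.
// './config.json' is just a placeholder path for illustration.
const fs = require('fs');

fs.readFile('./config.json', 'utf8', (err, data) => {
  if (err) {
    console.error('read failed:', err.message);
    return;
  }
  console.log('file contents:', data);
});

console.log('this line runs before the file has been read');
```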


In Node’s architecture, this event-driven model is provided by a mechanism called the event loop. The event loop is what allows Node.js to handle high-throughput operations, and it is supplemented by a multi-platform C library called libuv (https://github.com/libuv/libuv).

I/O operations in Node.js usually refer to slow operations such as accessing external resources like disks or the network. These types of I/O operations are the most time-expensive because they take the longest to complete.

Waiting for I/O to complete is one of the largest sources of wasted time in most programming languages.

## Handling I/O

There are three popular ways to deal with slow I/O operations. The easiest is to process requests **synchronously**, that is, handle one request at a time. This is a poor approach because it holds up other requests: no other request will be processed until the current one completes. Another approach is to **fork a new process** to handle each new request. While this is an improvement over the first approach, it does not scale well for hundreds or thousands of requests.

Forking a new process is memory-expensive, since resources are allocated to each new process even while it sits idle. The most popular approach to handling I/O requests is *threads*: a new thread is created to handle each new request. While this approach uses less memory and fewer resources, things can get complicated quickly because you have to manage shared resources. An example of the multithreaded approach is the Apache web server: it spawns a new thread for every new request.

If you have used Apache before, you can see how its memory consumption grows as the number of concurrent connections increases and more threads are created to handle new requests. Nginx, a popular alternative to Apache, takes a different approach: like Node.js, it is single-threaded and event-driven, and it uses an event loop to handle many requests on a single thread, so its memory usage stays low even under a large number of concurrent connections.
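To make the difference concrete, here is a hypothetical Node.js server (the port, routes, and five-second delay are all illustrative): the busy-wait stands in for a slow synchronous operation and stalls every other request on the single thread, while the `setTimeout()` branch hands the wait to the event loop so other requests keep being served.

```js
// Hypothetical server: the busy-wait stands in for a slow synchronous operation.
// The port, routes, and five-second delay are illustrative only.
const http = require('http');

http.createServer((req, res) => {
  if (req.url === '/blocking') {
    const end = Date.now() + 5000;
    while (Date.now() < end) {}                 // blocks the single thread for 5s
    res.end('done (blocking)\n');
  } else {
    // Non-blocking: the timer is handed to the event loop and other requests
    // keep being served in the meantime.
    setTimeout(() => res.end('done (non-blocking)\n'), 5000);
  }
}).listen(3000);
```

Requesting `/blocking` in one window and any other path in a second window should show the second response stuck behind the busy-wait, which is exactly the “hold up other requests” problem described above.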

The event loop is used to handle slow I/O operations without blocking the main thread of execution.

## The Event Loop

The event loop is the mechanism that allows Node.js to perform non-blocking I/O operations by offloading them to the operating system kernel whenever possible. Despite being single-threaded, Node.js takes advantage of the kernel being multithreaded and able to handle multiple operations in the background. The event loop starts automatically whenever Node.js executes your JavaScript code.

This is what makes the asynchronous style of programming possible in Node.js. It is a semi-infinite loop that performs polling and blocking calls to the system kernel until one of the operations is completed.

At this point, the kernel will notify Node.js that the appropriate callback function may be added to its poll queue to be eventually executed. Node.js exits when it no longer has any events to process.
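A small illustration of that exit behaviour (the one-second delay is arbitrary): the pending timer is the only event left, so the process stays alive until its callback has run, after which the loop drains and Node.js exits on its own.

```js
// The pending timer is the only event left, so the process stays alive until
// its callback has run; then the event loop drains and Node.js exits on its own.
setTimeout(() => {
  console.log('last pending event handled');
}, 1000);

console.log('main script finished; the process now waits on the event loop');
```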

Once there are no more events to process, the event loop completes and exits. Contrary to popular event-loop diagrams found on the web, the event loop in Node.js does not run through and process a stack. Instead, it goes through a set of *phases*, each with specific tasks, handled in a round-robin manner. Another common misconception is that asynchronous operations in Node are always offloaded to the thread pool provided by libuv. In truth, libuv’s thread pool is used for asynchronous I/O only when there is no other way: wherever possible, the event loop will first use the asynchronous interfaces the operating system provides before falling back to its own thread pool.

## Phases Overview

Understanding what occurs in each phase is the key to fully understanding the event loop. Each run of the event loop goes through the following phases:

* Timers – callbacks scheduled by `setTimeout()` and `setInterval()` are processed in this phase
* I/O callbacks – this is where callbacks from user code are processed
* Poll – new I/O events are retrieved here; Node.js will also block in this phase when necessary
* Set Immediate – where `setImmediate()` callbacks are processed
* Close – where all `'close'` callbacks are handled

For each run of the event loop, Node.js checks whether there is any pending asynchronous I/O or timers, and shuts down if there are none.
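One way to observe the phase ordering (reading the script’s own file is just a convenient way to get inside an I/O callback): from an I/O callback, the loop reaches the check phase before it wraps back around to the timers phase, so `setImmediate()` fires before a zero-delay `setTimeout()`.

```js
// Inside an I/O callback (poll phase), the loop reaches the check phase before
// wrapping around to the timers phase, so setImmediate() fires first.
const fs = require('fs');

fs.readFile(__filename, () => {
  setTimeout(() => console.log('timeout (timers phase)'), 0);
  setImmediate(() => console.log('immediate (check phase)'));
});
// Expected output: "immediate (check phase)" then "timeout (timers phase)".
```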

There’s a FIFO (first in, first out) queue of callbacks for execution in each phase. When the event loop enters a phase, it performs any operations specific to that phase until the queue has been exhausted or the maximum number of callbacks has been executed, and then it moves on to the next phase, and so on.
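A tiny sketch of that FIFO draining (the zero delays simply queue several callbacks for the same phase): callbacks queued for the timers phase run in the order they were registered.

```js
// Callbacks queued for the same phase are executed in the order they were added.
// All three timers below land in the timers-phase queue and drain FIFO: 1, 2, 3.
setTimeout(() => console.log('timer callback 1'), 0);
setTimeout(() => console.log('timer callback 2'), 0);
setTimeout(() => console.log('timer callback 3'), 0);
```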

## Summary

This has been a short overview of how the Node.js event-driven concurrency model works.
