Concurrency is the art of managing multiple tasks in overlapping time periods, creating the illusion that they are all progressing simultaneously. The key word here is "illusion": concurrency doesn't necessarily mean that these tasks are actually running at the same time or in parallel. Instead, it focuses on efficiently managing and scheduling tasks so that each can make progress independently without waiting for the others.
Multithreading is the use of multiple threads within a single process. These threads can make progress without waiting for one another to complete their tasks. Even on a single-core machine, multithreading is achieved through rapid context switching: the CPU scheduler switches between threads, giving each a time slice of execution. This creates the appearance of concurrent execution even though only one thread is running at any given moment. Multithreading is therefore a form of concurrency.
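A minimal sketch of this in Python (the `worker` function and thread names are illustrative): three threads each "work" for one second, but because they run concurrently, the total wall time is roughly one second rather than three.

```python
import threading
import time

def worker(name: str) -> None:
    # time.sleep simulates a blocking task; while one thread sleeps,
    # the scheduler gives other threads their time slices.
    time.sleep(1)

threads = [threading.Thread(target=worker, args=(f"thread-{i}",)) for i in range(3)]
start = time.time()
for t in threads:
    t.start()
for t in threads:
    t.join()  # wait for all threads to finish
elapsed = time.time() - start
print(f"3 threads sleeping 1s each took {elapsed:.1f}s of wall time")
```

Note that in CPython the global interpreter lock means only one thread executes Python bytecode at a time; the speedup here comes from the threads overlapping their waiting, which is exactly the concurrency-without-parallelism idea described above.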
Asynchronous programming pertains to handling time-consuming operations, especially I/O (Input/Output) tasks, without blocking the main thread of execution. It permits a program to initiate a task and then continue executing other work without waiting for the initiated task to complete. When the task finishes, a callback or event signals its completion, enabling the program to respond accordingly. In GUI applications, asynchronous programming ensures responsiveness by freeing up the UI thread; in server-side applications, it improves scalability by freeing up request-handling threads. Asynchronous programming is a form of concurrency.
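A small sketch using Python's asyncio (the `fetch` coroutine and its delays are illustrative stand-ins for real I/O such as network calls): two tasks are started together, and while each one awaits its "I/O", the event loop is free to run the other.

```python
import asyncio

async def fetch(name: str, delay: float) -> str:
    # asyncio.sleep stands in for a non-blocking I/O operation;
    # awaiting it yields control back to the event loop.
    await asyncio.sleep(delay)
    return f"{name} finished"

async def main() -> list[str]:
    # gather starts both coroutines concurrently; neither blocks
    # the other while waiting, so total time is ~0.5s, not ~1s.
    return await asyncio.gather(fetch("a", 0.5), fetch("b", 0.5))

results = asyncio.run(main())
print(results)
```

Unlike multithreading, this interleaving happens cooperatively on a single thread: tasks hand control back at each `await`, which is what keeps the main thread free.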
Parallelism involves dividing the work among multiple threads or processes, each running independently on a different core of a multi-core system. Unlike concurrency, which is about structuring a program so that many tasks can make progress, parallelism is about literally executing tasks at the same time, and it therefore requires multiple cores.