
How did the term "asynchronously" come to mean ...

  1. Dec 11, 2016 #1
    ... the exact opposite of its dictionary meaning?

    a- means "not"; synchronous means "happening, existing, or arising at precisely the same time" (https://www.merriam-webster.com/dictionary/synchronous)

    So if I have two lines like

    Code (Text):

    double d = Math.Sqrt(5.0);
    string s = new String('f', 1239);
     
    that run in sequence, meaning the second line starts executing only after the first has finished, then they did not run at the same time, i.e. not in sync.
     
  2. Dec 11, 2016 #2

    Svein

    Science Advisor

    In electronics, synchronous has come to mean "controlled by a common clock". Sadly, asynchronous has come to mean several slightly different things based on "not controlled by a common clock". So, asynchronous can mean
    • Not controlled by any clock
    • Controlled by several independent clocks
    Both concepts are tricky to implement...
     
  3. Dec 11, 2016 #3

    Filip Larsen

    Gold Member

    In computer science asynchronous roughly means out of sync, i.e. uncoordinated in time.

    So your two lines can be said to execute synchronously because their order and timing of execution are coordinated: the second statement will not execute until after the first statement has completed. If the statements were executed in parallel (by some mechanism provided by the language and environment of choice) they would execute asynchronously with respect to each other, as in the sketch below.
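
    To make that concrete, here is a minimal sketch in C# (an illustration only, using Task.Run from the Task Parallel Library as one possible parallel mechanism; it is not the only one):

    Code (Text):

    using System;
    using System.Threading.Tasks;

    class Program
    {
        static void Main()
        {
            // Synchronous: the second statement cannot start until the first completes.
            double d = Math.Sqrt(5.0);
            string s = new String('f', 1239);

            // Asynchronous: each statement runs as an independent task, so their
            // order and timing are no longer coordinated with each other.
            Task<double> td = Task.Run(() => Math.Sqrt(5.0));
            Task<string> ts = Task.Run(() => new String('f', 1239));

            // Re-synchronize only at the point where both results are needed.
            Task.WaitAll(td, ts);
            Console.WriteLine(td.Result + " " + ts.Result.Length);
        }
    }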
     
  4. Dec 11, 2016 #4

    Mark44

    Staff: Mentor

    Actually, it doesn't, as Svein and Filip have already noted. Operations that are synchronous don't have to occur at the same time, but they do occur in close temporal proximity. Asynchronous ("async") operations generally don't occur closely in time. Many of the classes in the .NET Framework have member methods of both types (sync and async). With an async method, your code can continue running instead of having to wait for the method's result. For example:
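
    A minimal sketch of that pattern (assuming C# 7.1+ for the async Main entry point; HttpClient.GetStringAsync is one such async member method, and the URL here is just a placeholder):

    Code (Text):

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class Program
    {
        static async Task Main()
        {
            using (var client = new HttpClient())
            {
                // Start the download; control returns to this method immediately.
                Task<string> page = client.GetStringAsync("https://example.com");

                // The code keeps running while the request is in flight.
                Console.WriteLine("Doing other work while the download proceeds...");

                // Wait only at the point where the result is actually needed.
                string html = await page;
                Console.WriteLine("Downloaded " + html.Length + " characters.");
            }
        }
    }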
     
  5. Dec 11, 2016 #5

    QuantumQuest

    Gold Member

    The whole thing has to do with the reference (clock) used to decide what is in or out of sync, as noted above. In particular, for code execution: if your system has just one processor, the OS-controlled time slicing of processes makes the processes that reach the CPU appear synchronous from your point of view, because each time slice runs for a very short time and each process receives its next slice after only a very short wait, even though the processes are not actually running in parallel (pseudoparallelism). Relative to the CPU clock, the time slices dispatched for execution run in sync. If you have more than one processor then, depending on the specific architecture, processes can run truly in parallel, and so be in sync both from your point of view and relative to the coordinating clock. A rough sketch of the single-processor interleaving follows.
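
    (The thread names and sleep interval below are arbitrary, and the exact interleaving depends entirely on the OS scheduler.)

    Code (Text):

    using System;
    using System.Threading;

    class Program
    {
        static void Worker(string name)
        {
            for (int i = 0; i < 5; i++)
            {
                // On a single core the OS scheduler interleaves these loops in
                // short time slices; only one thread actually runs at any instant.
                Console.WriteLine(name + ": step " + i);
                Thread.Sleep(10); // give up the rest of the current time slice
            }
        }

        static void Main()
        {
            var a = new Thread(() => Worker("A"));
            var b = new Thread(() => Worker("B"));
            a.Start(); b.Start();   // the two workers now appear to run "at the
            a.Join(); b.Join();     // same time" from the user's point of view
        }
    }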
     
  6. Dec 11, 2016 #6

    rcgldr

    Homework Helper

    The term "asynchronous" dates back to the days of mainframes (1960's / 1970's), such as "asynchronous I/O". It has been somewhat changed / expanded in its meaning since those early days. I'm not sure why "asynchronous" was chosen to describe those type of operations.
     
  7. Dec 11, 2016 #7
    Do you have a source for this? I'm giving a talk on asynchronous programming and I want to include this fact if I can find a citation for it.
     
  8. Dec 12, 2016 #8

    rcgldr

    Homework Helper

    Try a web search for "IBM asynchronous I/O" or "CDC asynchronous I/O". You should find links like the ones below. I'm not sure where or when the term "asynchronous I/O" originated, but its usage dates back to the 1950s, along with "overlapped I/O". There may not be many hits for the CDC machines. The CDC 3000 and 6000 series computers had multiple DMA-like processors called PPUs (Peripheral Processor Units). They could be used for I/O or to move memory, and could be set to generate an interrupt at completion.

    Asynchronous (or overlapped) I/O, with usage implied back to the 1950s:

    http://people.cs.clemson.edu/~mark/io_hist.html

    "asynchronous events"

    http://en.wikipedia.org/wiki/IBM_System/360_architecture

    Wiki articles, but no dates:

    http://en.wikipedia.org/wiki/Asynchronous_I/O

    http://en.wikipedia.org/wiki/Overlapped_I/O

    Side note - a bit of trivia: although most CP/M systems (late 1970s) and all PCs (1981) had DMA, the original (1984) Macintosh did not include DMA for SCSI (so no overlapped I/O). Instead it used a feature called "blind transfer": the code would poll until the first byte of a 512-byte transfer was ready, then perform a hard loop with no software handshake to transfer the remaining 511 bytes. There was a hardware handshake that paused the entire Mac as needed for each byte, with what I seem to recall was a 16-microsecond timeout that would fail the I/O but allow the Mac to run again to do things like refresh DRAM. Third-party vendors made proper DMA SCSI controllers for later Macs. I don't recall when Macs first included DMA.
     
    Last edited: Dec 12, 2016