unit A basic unit of computation, one period of a computer
clock. Each instruction takes a number of clock cycles. Often the
computer can access its memory once on every clock cycle, and
so one speaks also of "memory cycles".
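As an illustrative sketch (not part of the original entry), the relationship between cycle count, clock rate, and execution time can be computed directly. All numbers below are made-up values chosen for the example:

```python
# Execution time follows from the cycle count and the clock rate:
#   seconds = total_cycles / clock_hz
clock_hz = 2_000_000_000        # hypothetical 2 GHz clock
cycles_per_instruction = 3      # assumed average cycles per instruction
instructions = 1_000_000        # assumed program length

total_cycles = instructions * cycles_per_instruction
seconds = total_cycles / clock_hz
print(seconds)  # 0.0015
```

The same arithmetic applies to memory cycles: a program that touches memory on every cycle is bounded by how many memory accesses the clock rate allows per second.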
What every hacker wants more of (one noted hacker
describes himself as a "cycle junkie"). There are only so
many cycles per second, and when you are sharing a computer
the cycles get divided up among the users. The more cycles
the computer spends working on your program rather than
someone else's, the faster your program will run. That's why
every hacker wants more cycles: so he can spend less time
waiting for the computer to respond.
The use of the term "cycle" for a computer clock period can
probably be traced back to the rotation of a generator
producing alternating current, though computers generally use
a clock signal closer to a square wave. Interestingly, the
earliest mechanical calculators were driven by gears and
cranks, which rotated in true cycles.