Giga means a billion (1,000,000,000)
Tera means a trillion (1,000,000,000,000)
For CPU speed (its clock frequency), these prefixes describe how many cycles the CPU runs per second. Since 1 Hz = 1 cycle per second, 1 MHz (megahertz) = 1 million hertz, or 1 million cycles per second, and 1 GHz (gigahertz) = 1 billion hertz, or 1 billion cycles per second. RAM and the motherboard's FSB (front side bus) also have their frequencies given in these units.
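To make that arithmetic concrete, here is a minimal Python sketch (the 3.2 GHz figure is just an assumed example value) converting a clock frequency in GHz to raw cycles per second and to the length of a single cycle:

```python
# Hypothetical CPU clock: convert GHz to cycles per second and to cycle time.
clock_ghz = 3.2                                   # assumed example value
cycles_per_second = clock_ghz * 1_000_000_000     # 1 GHz = 10^9 Hz = 10^9 cycles/s
cycle_time_ns = 1e9 / cycles_per_second           # duration of one cycle in nanoseconds

print(f"{clock_ghz} GHz = {cycles_per_second:,.0f} cycles per second")
print(f"One cycle takes about {cycle_time_ns:.3f} ns")
```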
For data storage, these units can be megabits, gigabits, terabits, etc., or megabytes, gigabytes, terabytes, etc. 1 megabit = 1 million bits, 1 megabyte = 1 million bytes, and so on for giga and tera, but with billion and trillion respectively.
However, in practice in computing, 1 megabyte is often not exactly 1 million bytes; rather, it is 1024 x 1024 bytes, that is, 1,048,576 bytes.
Likewise, 1 gigabyte is not exactly 1 billion bytes; it is 1024 x 1024 x 1024 bytes = 1,073,741,824 bytes.
However, if you just consider the terms mega, giga and tera on their own, without any suffix (bits, bytes, etc.), then mega is exactly 1 million, giga is 1 billion and tera is 1 trillion.
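A small Python sketch of the two interpretations described above, printing the decimal (power-of-10) byte counts next to the binary (power-of-1024) ones:

```python
# Decimal (SI) prefixes vs. the binary sizes described in this answer.
MEGA = 10**6       # 1,000,000
GIGA = 10**9       # 1,000,000,000
TERA = 10**12      # 1,000,000,000,000

MEBI = 1024**2     # 1,048,576   ("computing" megabyte above)
GIBI = 1024**3     # 1,073,741,824
TEBI = 1024**4     # 1,099,511,627,776

print(f"megabyte: {MEGA:,} bytes (decimal) vs {MEBI:,} bytes (binary)")
print(f"gigabyte: {GIGA:,} bytes (decimal) vs {GIBI:,} bytes (binary)")
print(f"terabyte: {TERA:,} bytes (decimal) vs {TEBI:,} bytes (binary)")
```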
hahahahahaha. go back to math class
Do some research; a good place to start is http://en.wikipedia.org/wiki/
Mega = x 10^6
Giga = x 10^9
Tera = x 10^12
Can be hertz (cycles per second), can be bytes of storage, either RAM or ROM.
It's just a multiplier. It means the same in computers as it does anywhere else.
1 mega x 1024 gives giga, and 1 giga x 1024 gives tera.
In computers 1 kilo is 1024, and 1 kilo x 1024 gives mega.
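A quick Python sketch of that chain, building each step by multiplying by 1024 and showing that every value is a power of two:

```python
# Build the prefixes the way this answer describes: start with kilo = 1024
# and multiply by 1024 at each step.
kilo = 1024
mega = kilo * 1024      # 1,048,576
giga = mega * 1024      # 1,073,741,824
tera = giga * 1024      # 1,099,511,627,776

for name, value in [("kilo", kilo), ("mega", mega), ("giga", giga), ("tera", tera)]:
    # Each value is an exact power of two, e.g. kilo = 2**10, mega = 2**20.
    print(f"{name}: {value:,} = 2**{value.bit_length() - 1}")
```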
If my memory serves me right:
1000 KB = 1 megabyte
1000 MB = 1 gigabyte
1000 GB = 1 terabyte
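One practical way to see why the decimal and binary conventions both matter is disk sizes: drives are usually advertised in decimal gigabytes, while many operating systems count in binary units, so the same byte count looks smaller. A small Python sketch with an assumed 500 GB drive:

```python
# Hypothetical 500 GB drive: advertised in decimal GB, reported in binary units.
advertised_gb = 500
bytes_total = advertised_gb * 10**9          # 500,000,000,000 bytes as sold
reported_binary_gb = bytes_total / 1024**3   # about 465.66 "GB" in binary units

print(f"{advertised_gb} decimal GB = {bytes_total:,} bytes "
      f"= {reported_binary_gb:.2f} GB in binary units")
```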