assembly - Importance of Hexadecimal numbers in Computer Science -


When studying programming for the 8085, the 8086, and microprocessors in general, numbers are given in hexadecimal representation. OK, binary numbers are clearly important in computers. But why are hexadecimal numbers important? Is there a historical reason?

It would be nice if you could point to historical papers as well.

Edit:

How do computers handle hexadecimal numbers? For example, what happens in the 8085 when a hexadecimal number is given as input?

Hexadecimal has a closer visual mapping to the bytes used to store a number than decimal does.

For example, you can tell from the hexadecimal number 0x12345678 that the most significant byte will hold 0x12 and the least significant byte will hold 0x78. Its decimal equivalent, 305419896, tells you nothing.
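Here is a minimal C sketch (not part of the original answer) illustrating that mapping: because each pair of hex digits corresponds to exactly one byte, a simple shift-and-mask recovers the bytes you can already read off the hexadecimal literal.

```c
#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void) {
    uint32_t value = 0x12345678;

    /* Each pair of hex digits maps to exactly one byte,
       so shift-and-mask extracts them directly. */
    uint8_t msb = (value >> 24) & 0xFF;  /* 0x12 */
    uint8_t lsb = value & 0xFF;          /* 0x78 */

    printf("value = 0x%08" PRIX32 " (decimal %" PRIu32 ")\n", value, value);
    printf("most significant byte  = 0x%02X\n", (unsigned)msb);
    printf("least significant byte = 0x%02X\n", (unsigned)lsb);
    return 0;
}
```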

From a historical perspective, it is worth mentioning that octal was more commonly used when working with older computers that employed a different number of bits per word than modern 16/32-bit computers. From the Wikipedia article on octal:

Octal became widely used in computing when systems such as the PDP-8, ICL 1900 and IBM mainframes employed 12-bit, 24-bit or 36-bit words. Octal was an ideal abbreviation of binary for these machines because their word size is divisible by three.

As for how computers handle hexadecimal numbers: by the time the computer is dealing with the number, the base originally used to write it is irrelevant. The computer only deals with bits and bytes. See the sketch below.
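A small C sketch of that point (the example values are chosen here, not taken from the original post): the literals below spell the same value in hexadecimal, decimal and octal, and the compiler stores the identical bit pattern for each, so the program cannot tell which notation was used in the source.

```c
#include <stdio.h>

int main(void) {
    /* The same value written three ways; each literal becomes
       the identical bit pattern in memory. */
    int a = 0x4D;   /* hexadecimal */
    int b = 77;     /* decimal */
    int c = 0115;   /* octal */

    printf("%d %d %d\n", a, b, c);            /* prints: 77 77 77 */
    printf("equal: %d\n", a == b && b == c);  /* prints: 1 */
    return 0;
}
```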

