I still often encounter people with a Master's degree in Computer Science, Information Technology or the like who are not used to thinking in bits and bytes. The world of computing is huge, and I fully understand that universities need to filter and carefully select the topics they teach their students. However, there is a huge number of areas where knowledge of bit manipulation is important: IT security, resource leaks, format compatibility, communication protocols, low-level APIs, data conversion, and even such things as understanding the UTF-8 encoding used in XML and many other places. There is hardly a problem that cannot be understood by looking at it as bits and bytes.
Therefore, my advice to Computer Science students is this: learn and understand bits and bytes, addressing, data formats (including IEEE 754!), binary operators, pointers, and offsets. Don't consider yourself a good programmer before you know these by heart.
Your point is well taken, Lars. In addition, there are many successful developers who do not have a formal computer science degree and therefore may have had limited formal training on bits and bytes.
Can you suggest some books or other resources for these developers that will give them a start on their bit and byte education?
Sorry, but I learned the fundamentals writing Zilog Z80A machine code in the early 1980s, using bad photocopies of the CPU documentation, so I am not in a position to recommend any available literature.
I have seen some student textbooks that cover the topics, but whose owners did not spend a lot of energy on them. Maybe it is a question of getting the right focus.
Cary, like Lars, I got my knowledge from the Z80/8085/6802 instruction sets, in conjunction with building basic TTL circuitry (flip-flops, sync/async up/down counters, shift registers, adders, Gray code counters, etc.), so my best (but not very specific) guess would be to find some basic digital electronics books. Even though TTL circuitry is dead, you still need this knowledge when using ASICs and the like. But actually writing a chunk of working assembly code for any processor or controller would most likely also get you there.
Jan, I don't agree that ASICs, CMOS design, etc. are necessary knowledge for programmers, but I agree that learning assembly code is the best way forward. That will also help in understanding problems with hyperthreading, cache systems, different memory types, etc., all of which require high-level programmers to adopt certain programming styles. It doesn't make sense to parallelize some code if the two threads keep killing each other's caches, and that's difficult to understand unless you know how a CPU works.
Lars, I think you misunderstood what I said (or I wasn't clear enough :o)
What I meant was that bit/byte knowledge is necessary to use ASICs etc. - not that the electronics stuff is necessary for programmers in general.
And the basic digital electronics books will cover bits, bytes and their operators, plus boolean algebra (Karnaugh mapping, etc.) in depth; the later chapters will cover microprocessors and their ALUs, registers, stack handling, addressing, memory mapping, etc. (not really bit/byte stuff, but also important knowledge).