Naked Science Forum

Non Life Sciences => Technology => Topic started by: Jason Fong on 10/09/2009 08:30:02

Title: How efficient is binary?
Post by: Jason Fong on 10/09/2009 08:30:02
Jason Fong asked the Naked Scientists:
   
Hi Chris,

As we all know the standard qwerty keyboard layout is rather inefficient, error prone and slow. After all in school we learn that it was designed to slow down "typewritists".

After stumbling into the dvorak keyboard layout whilst trying to increase my characters per minute typed, I suddenly began to question the efficiency of codes such as binary.

Is binary inherently the most efficient way of representing any set data using two states? Or is it just efficient in conveying a set number of characters?

Or should I be asking about the efficiency of code which is written in binary?

Also would the complexity of computer code increase linearly or exponentially as we hopefully in the future mature into quantum computing?

Perhaps binary is only efficient for linear calculations, I read that quantum computing will allow multi threaded algorithms, such as the ones used to calculate large primes, to run much faster/efficiently.

What do you think?
Title: How efficient is binary?
Post by: lyner on 10/09/2009 17:15:46
It all depends upon what you mean by "efficient".
Do you mean the length of code needed to represent a certain amount of information?
Do you mean how quickly a computer can perform a given basic operation?
Do you mean how quickly you can write a program to do something?
If the last is what you mean then all I can say is
10 Print "Hello World"
(Younger readers may need to talk to their grandmothers for an explanation of that)
which proves that binary is not too efficient for at least one purpose.
Title: How efficient is binary?
Post by: syhprum on 11/09/2009 04:25:06
Unfortunately my grandmother (born 1885) is not available to consult, please explain.
Is it a statement in 'Basic'?
Title: How efficient is binary?
Post by: LeeE on 11/09/2009 15:39:01
Yes, it depends upon what you mean by 'efficient'.

In any numbering system there's a trade-off between the number of symbols that you use and how many of those symbols are needed to express a value.  With the decimal system we have ten symbols, so we can represent ten different values with just a single symbol whereas with binary we have only two symbols, so we can only represent two values with a single symbol i.e. 0 & 1.

As I pointed out in another recent thread (after initially getting it wrong - Oops  [;)]) you can count up to 1023 using your fingers and thumbs if you do it in binary, rather than just 10 if you use decimal.
Title: How efficient is binary?
Post by: graham.d on 11/09/2009 16:17:39
I have seen efficiency described by the number of positions of a matrix. For example, to describe 1 million - 1 (=10^6 - 1) in the decimal system there would be a 6 x 10 matrix. That is 6 positions (units, tens, hundreds etc) by 10 states (0,1,2,3,4,5,6,7,8,9). Binary is a smaller matrix. To avoid fractions a slightly larger number 2^20 - 1 (=1,048,575) is represented by a 20 x 2 matrix. But a number base of 3 is the most efficient, with a 13 x 3 matrix giving 3^13 - 1 (= 1,594,322).

I believe this can be proved with more mathematical rigor than this example, and, as is the nature with mathematical abstractions, it is actually possible to show the most efficient number base is "e" (2.718281828..). Though it is hard to see what this means as a number base. I remember learning about this some time ago. It's a shame that ternary logic never caught on :-)
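The matrix arithmetic above is easy to check numerically; here is a short Python sketch (function names are my own):

```python
def digits_needed(n, base):
    """Smallest number of base-`base` digit positions able to represent n different values."""
    d = 1
    while base ** d < n:
        d += 1
    return d

def matrix_cost(n, base):
    """The 'matrix size' described above: digit positions multiplied by states per position."""
    return digits_needed(n, base) * base

# roughly a million values in each base:
for base in (2, 3, 10):
    print(base, digits_needed(10**6, base), matrix_cost(10**6, base))
```

Base 3 does indeed come out smallest (39, against 40 for binary and 60 for decimal); the continuous version of this cost, b*ln(N)/ln(b), is minimised at b = e, as noted above.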
Title: How efficient is binary?
Post by: syhprum on 11/09/2009 16:47:17
I think ENIAC was the only large computer to use base 10; this was done to reduce the number of vacuum tubes needed to store data. All later computers seem to have used base 2 as better ways of storing data were devised.
Vacuum tube circuits were devised to work in base 3, but as far as I know no large computer ever used them.
Title: How efficient is binary?
Post by: lyner on 11/09/2009 17:44:04
Morse code is an example of a fairly efficient code for English text. Unlike ASCII, with its fixed-length symbols, it uses short symbols for common letters and longer symbols for uncommon letters. It is an early example of what are now called variable-length codes. Fiddly to code and decode but very efficient in the use of bits.
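The Morse idea of giving frequent letters short symbols was later formalised as Huffman coding; a minimal Python sketch (the letter frequencies are illustrative, not exact):

```python
import heapq

def huffman_lengths(freqs):
    """Return the optimal prefix-code length for each symbol.

    Builds the Huffman tree bottom-up: each merge pushes every symbol
    in the merged subtree one level deeper.
    """
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    next_id = len(heap)   # tie-breaker so the dicts are never compared
    while len(heap) > 1:
        f1, _, a = heapq.heappop(heap)
        f2, _, b = heapq.heappop(heap)
        merged = {s: depth + 1 for s, depth in {**a, **b}.items()}
        heapq.heappush(heap, (f1 + f2, next_id, merged))
        next_id += 1
    return heap[0][2]

# rough English letter frequencies (%): common letters get short codes
print(huffman_lengths({'e': 12.7, 't': 9.1, 'a': 8.2, 'q': 0.1, 'z': 0.07}))
```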

LeeE
I'm desperately trying to find some fallacy in your thing about fingers and counting. There's something suspicious about it but I can't think what. I.e. it may be true but not relevant to the notion of efficiency.
Title: How efficient is binary?
Post by: Geezer on 11/09/2009 19:38:51
LeeE must be quite correct, certainly in terms of fingers (assuming each finger only has two possible states of course) because there can be no more than 1024 unique patterns of ten fingers. All possible combinations are used, from 0000000000 to 1111111111

Other counting methods (using fingers or anything else where only two states are allowed) fail to make use of all possible combinations.

For this reason, as long as a "bit" can only have two states, binary representation will always be the most efficient in terms of bits. And bits don't come for free in hardware terms, so you would like to use as few as possible.

BTW, here's a link to a recent thread on the same subject.

http://www.thenakedscientists.com/forum/index.php?topic=24968.0
Title: How efficient is binary?
Post by: lyner on 11/09/2009 21:43:49
When using fingers to count, I'm not sure it's the binary state of fingers that counts as much as the position of the raised finger. Or is that just another way of looking at it?
Title: How efficient is binary?
Post by: Geezer on 11/09/2009 22:15:36
Traditionally we only raise one finger at a time to count (although Churchill did make it popular to raise two - something to do with Agincourt I think). However, that's a very inefficient method of counting in terms of fingers. By assigning the values 1, 2, 4, 8, 16 etc. to particular fingers we can encode any number between 0000000000 and 1111111111 if you like to count in binary or 0 and 1023 if you prefer decimal notation.
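The weighting scheme above amounts to reading the ten fingers as a ten-bit number; a sketch (the least-significant-finger-first ordering is my assumption):

```python
def fingers_to_value(fingers):
    """fingers: ten 0/1 entries (0 = down, 1 = up), least significant first.
    Each raised finger contributes its assigned weight: 1, 2, 4, 8, ..."""
    return sum(bit << i for i, bit in enumerate(fingers))

print(fingers_to_value([1] * 10))  # all ten fingers up: 1023
```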

As someone once pointed out, this is all rather basic "book work" to those who know anything about digital systems.

Title: How efficient is binary?
Post by: lyner on 12/09/2009 10:30:16
I think you have to look at the two hands as, effectively, ten different symbols in a decimal system.
In which case, you would need about three such symbols to transmit 10^3 different states, which is roughly the same number of states that can be sent with ten binary digits. (This is a fairly common computer thing - as in kB often meaning 1024 bytes.)

I must say, I hadn't imagined, in my remotest dreams that "raised finger" could be taken in that way but, on re-reading, I see that it could have been.

Title: How efficient is binary?
Post by: DoctorBeaver on 12/09/2009 22:36:05
I like the ancient Babylonian method of counting on one's fingers. Forgetting the thumb, there are 3 segments to each finger. That means you can count to twelve on 1 hand. When you reach 12 you mark 1 on your other hand. Starting with the tip of the left index finger you check all 3 segments on that finger, then move to the middle finger & so on.

Thumbs were used to touch the relevant finger segments so touching the 2nd segment of the middle finger, left hand and 3rd segment index finger of the right hand you are indicating 3*12+5=41. The max number you can count to is 144 (assuming you have 4 fingers on each hand).

12 is a more efficient base than 10 as 12 is divisible by 2, 3, 4 & 6. 10 is only divisible by 2 & 5. Common fractions are readily calculable in base 12 too. In base 10 a fraction only terminates if its denominator is built from 2s and 5s, whereas base 12 gives exact representations of 1/3, 1/4, 3/4, 2/3 etc. I sometimes wish the Babylonian system had come to the fore rather than the base 10 system we now use.
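The segment-counting scheme is just positional base 12, with one hand holding the units (finger segments) and the other the completed dozens; a sketch of the 3*12+5 example (function name is mine):

```python
def babylonian_count(unit_segments, dozens):
    """unit_segments: 1-12, counted on the finger segments of one hand.
    dozens: completed twelves marked on the other hand."""
    return dozens * 12 + unit_segments

print(babylonian_count(5, 3))    # the example in the post: 41
print(babylonian_count(12, 11))  # the maximum mentioned: 144
```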
Title: How efficient is binary?
Post by: Geezer on 13/09/2009 05:41:00
I like the ancient Babylonian method of counting on one's fingers. Forgetting the thumb, there are 3 segments to each finger. That means you can count to twelve on 1 hand. When you reach 12 you mark 1 on your other hand. Starting with the tip of the left index finger you check all 3 segments on that finger, then move to the middle finger & so on.

Thumbs were used to touch the relevant finger segments so touching the 2nd segment of the middle finger, left hand and 3rd segment index finger of the right hand you are indicating 3*12+5=41. The max number you can count to is 144 (assuming you have 4 fingers on each hand).

12 is a more efficient base than 10 as 12 is divisible by 2, 3, 4 & 6. 10 is only divisible by 2 & 5. Common fractions are readily calculable in base 12 too. In base 10 a fraction only terminates if its denominator is built from 2s and 5s, whereas base 12 gives exact representations of 1/3, 1/4, 3/4, 2/3 etc. I sometimes wish the Babylonian system had come to the fore rather than the base 10 system we now use.

Ah ha! I think that might explain why Ancient Babylonians, Inc. had to announce yet another delay in shipments of their radical new base 12 architecture microprocessor. However, I just heard from their product manager and he is extremely confident they will actually release a Beta version of the chip during Q1, FY2010, only 3217 years behind schedule.
Title: How efficient is binary?
Post by: Geezer on 13/09/2009 07:16:17
I think you have to look at the two hands as, effectively, ten different symbols in a decimal system.
In which case, you would need about three such symbols to transmit 10E3 different states which is roughly the same number of states that can be sent with ten binary digits. (This is a fairly common computer thing - as in kB often meaning 1024Bytes.)

I must say, I hadn't imagined, in my remotest dreams that "raised finger" could be taken in that way but, on re-reading, I see that it could have been.

Sophiecentaur - Let's assume, for a moment, that you are sufficiently naive to have missed the implications of your "raised finger" observation, but what on earth are you babbling on about regarding binary systems? Are you suggesting that all the scientists and engineers in the computer industry are a bunch of twits who don't know what they are talking about?

Are you proposing a better system? If you have one, I'm confident you will receive a Nobel Prize if you reveal it, and I will happily support you if that is the case.

Sadly, I think you are just waffling because you can't stand the idea that others might know more about a subject than you ever will, but I'll be delighted if you can persuade me that my perceptions are misguided.
Title: How efficient is binary?
Post by: Geezer on 13/09/2009 09:14:17
SC - Recently, I happened upon a post from an undergrad who was obviously struggling with his course work and came to this site to seek help. Perhaps he was not applying himself sufficiently, but perhaps he was a brilliant technologist who could not relate to the teaching method at his school. I have no idea which it was.

However, your lack of knowledge of his situation did not deter you from pointing out that, if he wanted to get a degree, he better get into a library and figure it out for himself. I think you made some insightful comment along the lines of "That's why they call it a degree."

I'm having a little difficulty finding the particular post, but I'm sure you will remember it and be able to point us to the link.

Many thanks.

Geez
Title: How efficient is binary?
Post by: Make it Lady on 13/09/2009 15:42:23
Is no one going to answer Jason's question in a sane and sensible manner? I think he has probably gone running off to the medicine cupboard for some aspirin after reading this load of tripe. Jason, I wish I could help you. Perhaps I'll ask my 15 year old son as he might have a more mature reply.
Title: How efficient is binary?
Post by: Geezer on 13/09/2009 20:44:15
But Miss, he started it!
No I didn't. You did!
Didn't!
Did!
Swot - Ouch!
Waaa! He hit me Miss.

OK Miss, fair point. We are not being very considerate to our guest.

Jason is asking several questions. Perhaps we should try to identify them:

1. Are there more efficient human interfaces for data and text entry?
2. Are there more efficient means of representing values (in a computer) than binary numbers?
3. Would it be more efficient to represent computer code using something other than binary representation?
4. Will the amount of code required with quantum computing be much greater than the amount of code required with today's linear programming technique?
5. Will quantum computing allow us to calculate some things much more quickly?

(I hope the above captures the essence of Jason's question.)

Here's an attempt to answer them.

1. As Jason already pointed out, there are more efficient keyboard layouts than QWERTY. It's possible to do a lot of things to speed up user input. For example, it's possible to use stenographer-type keyboards which are more like playing chords on a piano. Unfortunately, QWERTY is so firmly entrenched that it's unlikely to be replaced any time soon.

2. I'm not sure if I got the question right exactly, but assuming I did, and assuming Jason is referring to computers that employ binary (i.e. two state) logic, then binary representation, storage and manipulation of numeric data is the most efficient method, certainly in terms of computer hardware utilization, because there are no unused states. It might seem like a pain to do everything in binary, but that's a human perspective because we (typically) count using digits that have ten states rather than two. Although it is possible to build computers that count using other bases, because the computer's logic is still all binary, doing so only adds complexity without improving performance. I don't think Jason was proposing anything other than two state hardware. If he was, then the answer would be very different.

3. The code that the computer executes is a mixture of op-codes and data. The op-codes are really just patterns that the processor recognises and acts upon. The associated data can be almost anything in the context of the op-codes. Again, because the logic is two state, the objective is to use all the available bits as efficiently as possible. If, for example, we only permitted binary-coded-decimal (BCD) representation "inside" the computer, we would not be making full use of all the available bits, so, for a given set of requirements, the BCD machine would have wider words and more memory than its plain binary counterpart.
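The BCD overhead in point 3 is easy to put numbers on; a sketch:

```python
def binary_bits(n):
    """Bits needed for the plain binary representation of n."""
    return n.bit_length()

def bcd_bits(n):
    """Bits needed for binary-coded decimal: four bits per decimal digit."""
    return 4 * len(str(n))

n = 999_999
print(binary_bits(n), bcd_bits(n))  # 20 bits in plain binary vs 24 in BCD
```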

4. I'm rather clueless about that one. I suspect the code does not expand because multiple instances of the code are executing simultaneously, but that's only a guess.

5. I hope so, otherwise we should not waste our time developing such a thing  [:D]
Title: How efficient is binary?
Post by: graham.d on 14/09/2009 12:44:44
Make it Lady, I did give a sensible, and I believe mathematically correct, answer but it was obviously considered too boring to bother with.
Title: How efficient is binary?
Post by: Geezer on 14/09/2009 19:22:38
Make it Lady, I did give a sensible, and I believe mathematically correct, answer but it was obviously considered too boring to bother with.
Certainly not boring, but possibly too erudite for a Geezer!
Would your scheme require three state logic, or could it be implemented with two state logic?
Title: How efficient is binary?
Post by: LeeE on 14/09/2009 19:36:54
You'd need three-state logic for trinary, four-state logic to work in base four etc.
Title: How efficient is binary?
Post by: Geezer on 14/09/2009 20:03:34
You'd need three-state logic for trinary, four-state logic to work in base four etc.

Agreed - unless you used two bits for three states, which would be very inefficient. I suppose two bits for four states would be OK, but I'm not sure why it would be any better than binary.

Jason's question was;
"Is binary inherently the most efficient way of representing any set data using two states?"

I think by "binary" he is referring to the binary representation of numbers (presumably including positive and negative values) and "two states" refers to two state (i.e., binary!) logic. So, if we are limited to two states (per bit) then yes, I would say that binary representation is as good as it gets.
Title: How efficient is binary?
Post by: graham.d on 15/09/2009 15:20:50
Actually three states is more efficient as a number system, but two states is easier to implement in logic because logic is based on something being true or false - a natural two state system.
Title: How efficient is binary?
Post by: Geezer on 15/09/2009 16:40:01
Actually three states is more efficient as a number system, but two states is easier to implement in logic because logic is based on something being true or false - a natural two state system.

I'll take your word for that [:)]

Wonder what a trinary J-K flip-flop would look like?
Title: How efficient is binary?
Post by: LeeE on 15/09/2009 18:13:12
Umm... "a trinary J-K flip-flop..."

Do you want to think that one through again?  [;D]

You can't really use several bits, e.g. two bits, to work in trinary because it's not just a question of how you store the data but also how you operate on that data.  Even though you could use two bits to represent trinary or base four, the underlying operations will still be binary because the logic is still two-state.  In trinary you need to work not only with trits, rather than bits, but also with three-state logic, hence my comment re a trinary flip-flop.  For example, if we use -1, 0 & +1 for our trinary states, what would be the opposite of each state?
Title: How efficient is binary?
Post by: Geezer on 15/09/2009 19:28:01
Umm... "a trinary J-K flip-flop..."

Do you want to think that one through again?  [;D]

You can't really use several bits, e.g. two bits, to work in trinary because it's not just a question of how you store the data but also how you operate on that data.  Even though you could use two bits to represent trinary or base four, the underlying operations will still be binary because the logic is still two-state.  In trinary you need to work not only with trits, rather than bits, but also with three-state logic, hence my comment re a trinary flip-flop.  For example, if we use -1, 0 & +1 for our trinary states, what would be the opposite of each state?
I was in the shower when I realized the folly of my question! It wouldn't be a flip-flop. It would have to be a trip-flop  [:D]
It ought to be possible to define a set of logic elements that would allow the construction of a trinary computer, but my brain is so "hard-wired" for binary that I'm having a hard time imagining what they would possibly be. For example, what would be the equivalent of a simple binary inverter? It makes no sense to me in trinary terms. As you point out, there is no such thing as "opposite" (I think!)
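For what it's worth, balanced ternary (trits of -1, 0, +1) does give a clean answer to the inverter question: negation is simply flipping the sign of every trit. A Python sketch (function names are mine):

```python
def to_balanced_ternary(n):
    """Return the trits of n (-1, 0, 1), least significant first."""
    if n == 0:
        return [0]
    trits = []
    while n:
        n, r = divmod(n, 3)
        if r == 2:       # a digit of 2 becomes -1 plus a carry
            r = -1
            n += 1
        trits.append(r)
    return trits

def negate(trits):
    """The trinary 'inverter': flip the sign of every trit."""
    return [-t for t in trits]

def value(trits):
    return sum(t * 3**i for i, t in enumerate(trits))

print(to_balanced_ternary(5))  # [-1, -1, 1], i.e. 9 - 3 - 1 = 5
```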
Title: How efficient is binary?
Post by: Mr. Scientist on 16/09/2009 01:24:20
Jason Fong asked the Naked Scientists:
   
Hi Chris,

As we all know the standard qwerty keyboard layout is rather inefficient, error prone and slow. After all in school we learn that it was designed to slow down "typewritists".

After stumbling into the dvorak keyboard layout whilst trying to increase my characters per minute typed, I suddenly began to question the efficiency of codes such as binary.

Is binary inherently the most efficient way of representing any set data using two states? Or is it just efficient in conveying a set number of characters?

Or should I be asking about the efficiency of code which is written in binary?

Also would the complexity of computer code increase linearly or exponentially as we hopefully in the future mature into quantum computing?

Perhaps binary is only efficient for linear calculations, I read that quantum computing will allow multi threaded algorithms, such as the ones used to calculate large primes, to run much faster/efficiently.

What do you think?

It is very essential when logging evaluated matrices. In fact, the entire universe could actually be composed of zeros and ones - but this would be a universe in its most simplistic format, without the need of the complexities. However - I believe the universe will not allow us to understand such complexities, or reduce it so easily. Concerning computers... well, it's necessary. Even when qubit states are produced as a superposition.
Title: How efficient is binary?
Post by: Mr. Scientist on 16/09/2009 02:41:29
http://www.oilanalysis.com/article_detail.asp?articleid=597

Please don't tell me this is vernon from the site here?

 [V]
Title: How efficient is binary?
Post by: graham.d on 16/09/2009 08:56:28
Ternary can be used in computing though we are not really used to thinking in this way. I have a suspicion that it would have been quite hard to think of the design of a multi-bit adder in binary for the first time (a cascade of full adders with carry), and 2's complement arithmetic is not all that obvious. The problem with ternary is that detecting 2 states (high or low) is easy in electronics, but a third state would demand some sort of window comparator or, alternatively, running 2 lines from every output. So basically binary is well suited to mapping on to the hardware and efficient in terms of silicon real estate.
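The cascade of full adders with carry mentioned above can be sketched directly as gate-level logic:

```python
def full_adder(a, b, carry_in):
    """One-bit full adder expressed as XOR/AND/OR gates."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def ripple_add(x_bits, y_bits):
    """Add two equal-length bit lists (least significant bit first)
    by cascading full adders - a ripple-carry adder."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 5 + 3 with four-bit inputs, LSB first:
print(ripple_add([1, 0, 1, 0], [1, 1, 0, 0]))  # [0, 0, 0, 1, 0] = 8
```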
Title: How efficient is binary?
Post by: LeeE on 16/09/2009 12:17:33
Lol  [:D] - it's only just occurred to me to check wikipedia...

http://en.wikipedia.org/wiki/Ternary_computer (http://en.wikipedia.org/wiki/Ternary_computer)

Title: How efficient is binary?
Post by: Geezer on 16/09/2009 16:35:37
Groan! I found the ternary links on Wiki last night too, but I was trying to keep it quiet  [:)]

Here's an idea: You can build any arbitrarily complex binary logic system from one simple logic element, for example, a two input NAND gate.

Sooooo, "all" we have to do is figure out the ternary/trinary equivalent of the simple nand gate, and we ought to be able to design any logic system we want - piece of cake!
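To illustrate the point, here is the standard construction of the other basic gates from NAND alone (a Python truth-table sketch):

```python
def nand(a, b):
    return 1 - (a & b)

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor_(a, b):
    # the classic four-NAND exclusive-or
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, '->', and_(a, b), or_(a, b), xor_(a, b))
```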
Title: How efficient is binary?
Post by: Geezer on 17/09/2009 03:03:16
Ternary can be used in computing though we are not really used to thinking in this way. I have a suspician that it would have been quite hard to think of the design of a multi-bit adder in binary for the first time (a cascade of full adders with carry) and 2's complement arithmatic is not all that obvious. The problem with ternary is that detecting 2 states (high or low) is easy in electronics but a third state would demand some sort of window comparator or, alternatively, running 2 lines from every output. So basically binary is well suited to mapping on to the hardware and efficient in terms of silicon real estate.

GrahamD,
The electronics would be rather challenging as you say, but I suppose it could be done if there was sufficient incentive. I can pretend to myself that I can almost understand how arithmetic operations might be carried out, perhaps because it's not difficult to check that the result of any arithmetic operation is correct. What totally defeats me are many of the logical operations. I think we will need a whole new set of tools because things like K-Maps assume binary states. For example, what would an XOR even mean in trinary? We would need to define some new rules for trinary logic.

Now here's the scary part:

It is necessary for me to reduce logic designs to rather fundamental terms, but that is only because I was trained to do that. At one time, logic gates were quite expensive, so I was conditioned to optimize a design to minimize the number of gates used. Not so the digital engineers of today. They don't think in terms of gates. They don't think in hardware terms at all. They think in terms of expressions and data transfers. These days, digital systems are expressed entirely in software terms.

Today, the only things that really matter are:
How large is the die? (chip size)
How much power does it consume?

So, assuming the hardware guys can devise a robust trinary implementation, the tool designers could create a set of tools that will allow the system designers to produce systems that operate on trinary, but nobody would even know the difference.

(Except the poor guys and gals who have to figure out what's going on when it does not work as expected.)
Title: How efficient is binary?
Post by: graham.d on 17/09/2009 11:10:34
Yes, Geezer, you are right. It would mean a lot of hard work for the CAD vendors though. Cadence, Mentor, Synopsys, Magma etc would have to do a lot of work for their money :-) Perhaps there would have to be a new expanded set of RTL instructions too. Logical operations could still be binary but arithmetic could be ternary. I can't see it catching on very quickly, though. In theory, it is about 50% more efficient in terms of states to be saved but the structures needed to save those states are probably more complex by a higher percentage.

Actually logical operations can be mapped on to a ternary (or any other) number system though it is not obvious because we tend to think of single logic gates. You have to consider multi-bit in (and multi-bit out) clouds of logic and then there can be a more efficient mapping on to ternary. I think this is where I would let a computer synthesise it :-)

Title: How efficient is binary?
Post by: LeeE on 17/09/2009 12:49:40
I was wondering about how this could be implemented in 'silicon' (but I fear I may only know enough to make a fool of myself here).

The wikipedia article on Computer Memory (Random-access memory):

http://en.wikipedia.org/wiki/Random-access_memory (http://en.wikipedia.org/wiki/Random-access_memory)

says...

Quote
Modern types of writable RAM generally store a bit of data in either the state of a flip-flop, as in SRAM (static RAM), or as a charge in a capacitor (or transistor gate), as in DRAM (dynamic RAM), EPROM, EEPROM and Flash.

Hmm... I'm not sure about how using charge in a capacitor is implemented in practice, but the flip-flop SRAM and transistor gate DRAM methods are both inherently binary devices.  A tri-state flip-flop, as discussed earlier, seems to be a bit of a contradiction in terms, and I think we might need a sort of combined pnp/npn transistor to use the gate method.  It would seem then, that the individual low-level hardware elements that would be needed to implement a trinary hardware system would be more complex, as graham.d says.

Heh, while the idea of trinary seems relatively straightforward at first glance, actually implementing it might be a lot trickier than you'd expect.  It makes me wonder how significant it is that no one appears to have made a trinary computer system since that 1950's Russian one; if it were easy, then I think it would have happened but as it hasn't, then it probably isn't.
Title: How efficient is binary?
Post by: Geezer on 17/09/2009 16:28:38
I was wondering about how this could be implemented in 'silicon' (but I fear I may only know enough to make a fool of myself here).

The wikipedia article on Computer Memory (Random-access memory):

http://en.wikipedia.org/wiki/Random-access_memory (http://en.wikipedia.org/wiki/Random-access_memory)

says...

Quote
Modern types of writable RAM generally store a bit of data in either the state of a flip-flop, as in SRAM (static RAM), or as a charge in a capacitor (or transistor gate), as in DRAM (dynamic RAM), EPROM, EEPROM and Flash.

Hmm... I'm not sure about how using charge in a capacitor is implemented in practice, but the flip-flop SRAM and transistor gate DRAM methods are both inherently binary devices.  A tri-state flip-flop, as discussed earlier, seems to be a bit of a contradiction in terms, and I think we might need a sort of combined pnp/npn transistor to use the gate method.  It would seem then, that the individual low-level hardware elements that would be needed to implement a trinary hardware system would be more complex, as graham.d says.

Heh, while the idea of trinary seems relatively straightforward at first glance, actually implementing it might be a lot trickier than you'd expect.  It makes me wonder how significant it is that no one appears to have made a trinary computer system since that 1950's Russian one; if it were easy, then I think it would have happened but as it hasn't, then it probably isn't.
I think you are quite correct. The hardware of today tends very much towards the binary. I suppose it might be possible to use positive or negative charge to store three states (pos, neg or zero), but I suspect it would be more economical to "fake it" using two bits!
It would not surprise me to learn that the Russian guy went off his rocker and had to be institutionalized. I'm pretty sure that's what's going to happen to me if I keep thinking about this topic  [;D]
Title: How efficient is binary?
Post by: Make it Lady on 17/09/2009 19:38:15
That's more like it, boys. A sweety for all of you. I'm sure Jason is smiling.
Title: How efficient is binary?
Post by: Geezer on 18/09/2009 04:38:09
Thanks Miss!

You're a sport. You can play with my conkers any time.
Title: How efficient is binary?
Post by: graham.d on 18/09/2009 08:56:21
Geezer and Lee, actually modern Flash memories already use multiple levels (see MLC Flash). Of course it is hard to get a high yield, so there is a huge amount of error correction used, but it saves a lot of chip area. It has been tried in dynamic memories too but without much commercial success. It is hard to see how to use it in SRAM (more or less by definition). The multiple levels tend to be binary related (4, 8, 16), but this is because the logic that follows will be binary. There is nothing to prevent 3 or 9 states being used.
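The way a multi-level cell packs two bits into one stored charge can be sketched as a simple threshold read-out (the voltage thresholds here are illustrative only, not real device values):

```python
def read_mlc_cell(voltage, thresholds=(1.0, 2.0, 3.0)):
    """Map a sensed cell voltage to one of four levels, i.e. two bits.
    Thresholds are illustrative, not taken from a real device."""
    level = sum(voltage > t for t in thresholds)  # 0..3
    return (level >> 1) & 1, level & 1            # (high bit, low bit)

print(read_mlc_cell(2.5))  # level 2 -> bits (1, 0)
```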
Title: How efficient is binary?
Post by: Geezer on 18/09/2009 17:40:44
Geezer and Lee, actually modern Flash memories already use multiple levels (see MLC Flash). Of course it is hard to get a high yield, so there is a huge amount of error correction used, but it saves a lot of chip area. It has been tried in dynamic memories too but without much commercial success. It is hard to see how to use it in SRAM (more or less by definition). The multiple levels tend to be binary related (4, 8, 16), but this is because the logic that follows will be binary. There is nothing to prevent 3 or 9 states being used.
Good point!
It struck me that it would not be too difficult to encode three states efficiently on rotating magnetic media too. The biggest challenge may be designing an efficient and reliable trip-flop.
Title: How efficient is binary?
Post by: LeeE on 18/09/2009 20:49:47
graham.d: Thanks for the pointer to MLC Flash.  Wikipedia reckons that in practice, each cell is limited to four states, corresponding to two bits, but this does seem to be due to manufacturing and yield issues, as you suggest.

It looks like one of the main limitations is the dynamic range that a cell can cope with.  The degree of separation between levels seems to be a key factor in avoiding errors, so with a wider dynamic range, you could use more levels while maintaining the same absolute separation, or possibly run them much faster for the same density.

Have you any idea if they are more susceptible to induced noise? I'm wondering if there's a risk that induced noise could change a charge level, which would be problematic. Yup, interesting stuff.
Title: How efficient is binary?
Post by: graham.d on 19/09/2009 21:43:23
There are experimental devices with 16 levels but they are very susceptible to errors and need a lot of error correction. It is also very difficult to discriminate the amount of charge stored. Even 4 levels is very challenging. There will be random noise superimposed upon the sense amplifiers when they make decisions about the amount of charge and this gets more significant as the differences between the levels is smaller. This contributes to the number of "soft" errors and the error correction software has to be sufficient to cope with it. This involves storing many more bits and having quite sophisticated error correction logic. Another interesting feature necessary for the life of Flash memories is the extensive hardware and sophisticated algorithms needed for "wear levelling". This is to spread the usage of the memory as evenly as possible as writing the same cells, as many programs would do, will cause them to fail in quite a short time. This technique extends the life considerably.