One of my university professors told me that the future lies in analogue electronics, especially in medicine. Digital devices, as you know, have only two states - 0 or 1. If, for example, you try to create a model of a neuron, whose signals are analogue, you will run into serious trouble.
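To illustrate the discretisation problem: a digital simulation can only approximate a neuron's continuous membrane voltage by stepping it forward in finite time slices. Below is a minimal sketch of a leaky integrate-and-fire neuron (all parameter values here are made up for illustration, not taken from any real model in this thread):

```python
# Minimal leaky integrate-and-fire neuron sketch (hypothetical parameters).
# A digital simulation approximates the continuous membrane voltage V(t)
# with discrete Euler steps of size dt - the smaller dt, the closer the
# approximation, but it is never truly analogue.

def simulate_lif(i_input=2.0, dt=0.1, t_max=100.0,
                 v_rest=0.0, v_thresh=1.0, tau=10.0, r=1.0):
    """Euler integration of dV/dt = (-(V - v_rest) + r*i_input) / tau.
    Returns the number of spikes fired within t_max milliseconds."""
    v = v_rest
    spikes = 0
    for _ in range(int(t_max / dt)):
        dv = (-(v - v_rest) + r * i_input) / tau
        v += dv * dt                  # the discretisation step
        if v >= v_thresh:             # threshold crossing -> spike
            spikes += 1
            v = v_rest                # reset after firing
    return spikes

print(simulate_lif())                 # strong constant input: repeated spiking
print(simulate_lif(i_input=0.5))     # sub-threshold input: prints 0
```

With a strong input the neuron spikes repeatedly; with a weak one the voltage settles below threshold and never fires. The accuracy of the spike timing depends entirely on the chosen `dt`, which is exactly the kind of compromise an analogue circuit never has to make.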
Nonetheless, I would advise anyone starting out in the industry to get a good understanding of all aspects of analogue design and, more fundamentally, the basics of electronics and the physics of semiconductors. These skills will always be in demand. The pressures in the digital design arena are to de-skill the detailed design process and move it to a higher level of system design. This is a challenging area in itself, but is more akin to programming and systems analysis. It depends on where your interests lie.
Hmmm... editing doesn't seem to be working - edits don't stick. Mods - can you help? I corrected my typos above twice, and added another paragraph, now lost :-(
I said essentially that the losses due to "bit rate reduction" are not inherent to digital (don't blame 'digital tech') but are a conscious choice by the broadcaster or person encoding the file to a lossy format. We could have superb TV pictures, if they chose to only broadcast 1/3rd as many stations...
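The trade-off above is just arithmetic: a broadcast multiplex has a fixed capacity, so the fewer stations you squeeze into it, the more bits each one gets. A quick sketch (the 24 Mbit/s figure is a rough ballpark for a terrestrial mux, assumed here for illustration; real capacity depends on modulation and coding):

```python
# Back-of-envelope bitrate arithmetic. MUX_CAPACITY_MBPS is an assumed
# ballpark figure for one broadcast multiplex, not a measured value.

MUX_CAPACITY_MBPS = 24.0

def bitrate_per_station(num_stations):
    """Evenly split the multiplex capacity among the stations carried on it."""
    return MUX_CAPACITY_MBPS / num_stations

# Cramming six stations into the mux starves each of bits...
print(bitrate_per_station(6))   # 4.0 Mbit/s each
# ...while carrying a third as many triples the bitrate per picture.
print(bitrate_per_station(2))   # 12.0 Mbit/s each
```

So the pixelation is a commercial choice about how many stations to carry, not an inherent limit of digital transmission.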
A friend of mine was trying out Netflix yesterday. It was fine when we ran it through a little fuzzy 14" CRT TV. When we connected it to a digital TV, what became obvious was that during slow scenes, the image quality was excellent. However, during intense action sequences, the lossy format caused unacceptable pixelation. Of course, it could have also been a hardware issue, so we'll do more testing shortly.