# Nonlinearity makes people do strange things

Control engineers know that non-linearity causes problems – the rest of the world could maybe take note…

## What is non-linearity?

Non-linearity is where the response of something is out of proportion to the change that's just been inflicted on it. If you tweak something and suddenly get a huge change, that's non-linear. Light switches are non-linear – there's a small change to the switch, but the light changes abruptly. A dimmer switch is linear – you turn it a bit, the light changes a bit. Turn it a lot and the light changes a lot.
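The distinction can be sketched in a couple of lines of Python (the function names and the 0.5 threshold are my own invention, purely for illustration):

```python
def dimmer(position):
    """Linear: output is proportional to the knob position (0.0 to 1.0)."""
    return position

def light_switch(position):
    """Non-linear: nothing happens until a threshold, then everything does."""
    return 1.0 if position >= 0.5 else 0.0

# The same small nudge of the input near the threshold...
print(dimmer(0.49), dimmer(0.51))              # 0.49 0.51 - small in, small out
print(light_switch(0.49), light_switch(0.51))  # 0.0 1.0   - small in, huge out
```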

## Controlling

If you want to use that knob to control something, it’s an awful lot easier if the response is linear. Imagine trying to control the speed of a car if the throttle pedal responded non-linearly. You press the pedal more and more, the engine does nothing, then suddenly, as you cross some invisible point, the engine bursts to life and you scream ahead!

Control engineers (some of whom designed the engine management system in your car) take great pains to keep everything as smooth (i.e. linear) as possible.

## So what about the rest of life?

Now you may be wondering what a relatively obscure engineering discipline has to do with the “rest of the world” I was talking about.

Say you wanted to “control” something like the performance of your sales force. Or some money managers… Maybe you’d say “if you do this well, you get this much bonus”.

That’s a non-linearity right there. A small change at the input (between meeting the target and just missing it), potentially huge change at the output (the bonus payment). This will encourage weird behaviours – towards the deadline date, a salesman who’s nearly met his target will cut a different deal to the one he would’ve done at the start of the month. Possibly to the detriment of the long-term interests of his company, but to his personal gain short-term. Or a banker might be encouraged to take a riskier bet on some market or other, because she knows it will look good against the measures that her bonus depends on. Not necessarily in the long-term interests of the company. Or possibly the country…

Taxation can also produce these sorts of behaviours – if there are step changes (like a jump to a 50% tax rate at a particular income), people will suddenly feel it worthwhile to try and avoid that last bit of tax. Whereas a continuous change, or even a pure flat rate, will provide less incentive, as there aren’t opportunities for big “savings” for a relatively small amount of effort.
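The bonus step can be sketched the same way as the light switch (all the figures here – the target, the bonus, the commission rate – are invented purely for illustration):

```python
def threshold_bonus(sales, target=100_000, bonus=20_000):
    """All-or-nothing: a step (non-linearity) right at the target."""
    return bonus if sales >= target else 0

def proportional_bonus(sales, rate=0.2):
    """Linear: every extra unit of sales earns the same reward."""
    return sales * rate

# One extra sale near the deadline swings the threshold bonus wildly...
print(threshold_bonus(99_999), threshold_bonus(100_001))        # 0 20000
# ...but barely moves the proportional one.
print(proportional_bonus(99_999), proportional_bonus(100_001))
```

With the threshold scheme, all the salesman's incentive concentrates on that one last deal; with the proportional scheme, every deal counts the same.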

I’m not convinced by bonus schemes as motivators at the best of times, but a linear scheme, where the return is strictly proportional to the gain you’ve given the company, preferably over the long term, will (IMHO :) produce much more sensible behaviour.

Maybe those specifying bonus schemes as a method of “control” (although I’m sure they’d hate it to be described as such) could learn a bit from control engineering?

# Sensible defaults

Last night I was woken up at midnight by the bleeping noise from downstairs. I blearily wandered down in search of it…

It turned out (as I’m sure you’d already surmised) to be the clock radio in the spare room. It had inadvertently been prodded into alarm mode. And the default alarm time is 12:00am. Bah! Surely it wouldn’t have been too much effort for the firmware designers to initialise the alarm time to 7:00am. I cannot believe that there’s so little spare capacity in a clock radio processor (I’m assuming it’s software… but even if it’s gates, the same applies) that a few extra initialising loads are out of the question!

If I ever have to write firmware for a clock – the alarm will default to something less irritating!

# How many languages does an engineer need?

Much of an electronic engineer’s life is spent in front of a screen. And much of that time is creating and debugging designs using some textual format or other. Arguably, even humble Excel falls into that category of “tools”. At some point many electronic designers will end up performing schematic capture and maybe PCB layout – but much of the design is already done by then.

So, what languages are used?

Having no knowledge of C programming will hamper one’s ability to bring up a new board with a microprocessor of any description on it – and it’s a rare system which has no micro at all! [Forth](http://en.wikipedia.org/wiki/Forth_(programming_language)) has several vocal proponents, but it’s definitely a minority language.

If you do logic design then either VHDL or (System)Verilog is a must – very few schematic tools exist in this space. And the tools with which these designs are developed, tested and debugged tend to be controlled by TCL (the Tool Command Language). TCL is a weird language – to me it feels dominated by the form of the source code: woe betide anyone who puts a brace on the wrong line, or omits a space after what (in most other languages) would be a separator character in its own right.

Personally, I’d rather use Python for writing scripts, and indeed I use Python frequently throughout my various roles, from quick data-analysis hacks through to a prototype stereo-vision algorithm. Perl is another good choice for quickly hacking together the tools that you need. However, I find it too easy to write unmaintainable code in Perl, which is why I’ve moved to Python.

Matlab is another must for engineers who have to deal with numbers (like signal processing!). Or one of its free competitors, Octave or Scilab. comp.dsp regular Vladimir Vassilevsky is fond of ironically saying “Matlab does all thinking for us” when a request is made for help implementing a system which has been designed in Matlab without thought for the real world. However, in the right hands, Matlab is a powerful tool for investigating engineering trade-offs very quickly.

I’m fond of using Matlab to investigate algorithmic possibilities, always keeping in mind what the end implementation might look like (in software or hardware). Once I have a good algorithm prototyped, I can then create Matlab implementations which are degraded in various ways to emulate how my real implementation will look. For example, using limited bit widths rather than double-precision numbers. Or approximate square-root algorithms rather than numerically precise answers. If the degraded algorithm differs too much from the golden starting point, I need to undegrade it some! Once this is done, it is then useful for testing my real implementation. If it differs from the prototype degraded algorithm, I’ve done something wrong in the implementation somewhere.
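The golden-versus-degraded workflow can be sketched in a few lines. This is a Python sketch of my own (not Matlab, and not any particular real design): a first-order low-pass filter as the “golden” double-precision model, alongside a version that quantises its state to a limited number of fractional bits, emulating a fixed-point implementation. The filter, the bit width and the error threshold are all assumptions for illustration.

```python
def quantise(x, frac_bits):
    """Round x to a fixed-point grid with frac_bits fractional bits."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

def golden_filter(samples, alpha=0.125):
    """Double-precision 'golden' first-order low-pass filter."""
    y, out = 0.0, []
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

def degraded_filter(samples, alpha=0.125, frac_bits=8):
    """Same filter, but each state update is quantised, as the
    real fixed-point hardware/firmware would do."""
    y, out = 0.0, []
    for x in samples:
        y = quantise(y + alpha * (x - y), frac_bits)
        out.append(y)
    return out

signal = [1.0] * 20  # step input
golden = golden_filter(signal)
degraded = degraded_filter(signal)

# If the worst-case difference is too large, the degraded model
# (and hence the planned implementation) needs more bits.
worst = max(abs(g - d) for g, d in zip(golden, degraded))
print(f"worst-case error: {worst:.4f}")
```

The degraded model then doubles as a reference when testing the real implementation: if the hardware disagrees with it, the bug is in the implementation, not the arithmetic.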

# Scales for engineers

If you’re a professional (or even proficient amateur) musician, you practise your scales, and other exercises daily. This keeps your muscle-memory alive for the basics of how they will move when playing pieces of real music.

What’s the equivalent for engineers?

Maybe mouse and keyboard exercises to keep our hands as productive as possible? Or mental exercises to ensure that the critical facts we need to know are to the fore. I can’t imagine not being able to have instant recall of my multiplication tables, for example.

And of course this changes over time as more and more mundane stuff is done for me by my computer. A couple of decades ago, a logic designer would have been using Karnaugh maps on a daily basis. I remember “using” them at university in coursework, and I got better the more I did them. But I haven’t used them in the real world, despite doing logic design for well over a decade!

Update:
And I’ve found someone with (at least) one answer