Back in the day, how to do trigonometry

I’ve been using microprocessors since I was in college in the late ’70s. One of the (many) things that makes me smile is the present-day need for GHz processors to do trigonometry.

This link talks about the 8087 math co-processor, which for the times was pretty amazing. One of its secrets was the CORDIC design.

There’s some hairy math stuff at the front, but some very simple code at the bottom, so scroll down.

This is an example of “work smarter, not harder”. Back in my mainframe days, when a mainframe that supported 100 people had less power than your cell phone, tricks like this made some serious science possible.

It’s well worth a look at how some lookup tables and a little math can work to your advantage.


I never knew about this processor until today. That’s very intuitive of Intel to do, yet so simple at the same time. Major props!

I knew about CORDIC, but I thought it was all software. Very interesting.

So I was (still am) an Intel MCS BASIC-52 fan of the Intel 8052 CPU. It’s super slow by today’s standards, but it was a very powerful chip that let you build a 6-7 chip system running at 14 MHz. It had trig functions built in, a key enabler for micromouse projects.

@Drew2158U - it’s all software, but using the ROM to store constants let you put the program space, the ROM space, and the RAM space each to their best use. ROM space has faster access time than RAM space. Being able to split and use the spaces this way was a big deal for hard-core bit-banging code. The PIC family had some CPU support that also made trig functions easy with table lookup.

And that brings my tale to the end: the PIC processor was in the VEX 0.5 processor. People like @Quazar had libraries that let us do some hard-core math on a small CPU. I often think that classes on how to make low-power CPUs do big things would translate into better code on more powerful boxes.


What you are describing is akin to “Mechanical Sympathy”. While I came to understand it through software development, the term is broader than that (I think it has its origins in car racing).

Anyways, I like to go back every so often and reread the origin story of the LMAX Disruptor implementation. It describes how well the developers understood the hardware that would run their software, and how they designed the software around what the hardware did well while avoiding what it did poorly.

A good read:

Martin Fowler also describes it (long):