Algorithm

For many people, the word “algorithm” evokes the arcane and inscrutable machinations of big data, big government, and big business: increasingly part of the infrastructure of the modern world, but hardly a source of practical wisdom or guidance for human affairs. But an algorithm is just a finite sequence of steps used to solve a problem, and algorithms are much broader—and older by far—than the computer. Long before algorithms were ever used by machines, they were used by people.

The word “algorithm” comes from the name of Persian mathematician al-Khwārizmī, author of a ninth-century book of techniques for doing mathematics by hand. (His book was called al-Jabr wa’l-Muqābala—and the “al-jabr” of the title in turn provides the source of our word “algebra.”) The earliest known mathematical algorithms, however, predate even al-Khwārizmī’s work: a four-thousand-year-old Sumerian clay tablet found near Baghdad describes a scheme for long division.

But algorithms are not confined to mathematics alone. When you cook bread from a recipe, you’re following an algorithm. When you knit a sweater from a pattern, you’re following an algorithm. When you put a sharp edge on a piece of flint by executing a precise sequence of strikes with the end of an antler—a key step in making fine stone tools—you’re following an algorithm. Algorithms have been a part of human technology ever since the Stone Age.

From the book Algorithms to Live By, by Brian Christian and Tom Griffiths.

The Best Debugging Tool of All

Rob Pike (one of the authors of the Go language) has this to say on debugging:

A year or two after I’d joined the Labs, I was pair programming with Ken Thompson on an on-the-fly compiler for a little interactive graphics language designed by Gerard Holzmann. I was the faster typist, so I was at the keyboard and Ken was standing behind me as we programmed. We were working fast, and things broke, often visibly—it was a graphics language, after all. When something went wrong, I’d reflexively start to dig in to the problem, examining stack traces, sticking in print statements, invoking a debugger, and so on. But Ken would just stand and think, ignoring me and the code we’d just written. After a while I noticed a pattern: Ken would often understand the problem before I would, and would suddenly announce, “I know what’s wrong.” He was usually correct. I realized that Ken was building a mental model of the code and when something broke it was an error in the model. By thinking about *how* that problem could happen, he’d intuit where the model was wrong or where our code must not be satisfying the model.

Ken taught me that thinking before debugging is extremely important. If you dive into the bug, you tend to fix the local issue in the code, but if you think about the bug first, how the bug came to be, you often find and correct a higher-level problem in the code that will improve the design and prevent further bugs.

I recognize this is largely a matter of style. Some people insist on line-by-line tool-driven debugging for everything. But I now believe that thinking—without looking at the code—is the best debugging tool of all, because it leads to better software.

Books in 2016

In terms of book reading, 2016 was great. For learning new things, it was fantastic. Everything else was so-so.

Here’s the word cloud of the book names. Looks like a brain. Really nice!

Here’s the full list of the books. A mixed bunch, though unfortunately with fewer books on investing than I would have liked.

Data, A Love Story – Amy Webb
59 Seconds: Think a Little, Change a Lot – Richard Wiseman
The Reluctant Mr. Darwin – David Quammen
Darwin Among the Machines – George Dyson
Math Geek: From Klein Bottles to Chaos Theory – Raphael Rosen
Letters from a Father to His Son Entering College – Charles Franklin Thwing
Guns, Germs, and Steel – Jared Diamond
Give and Take – Adam Grant
Packing for Mars – Mary Roach
The ISIS Apocalypse – William McCants
Respecting Truth – Lee McIntyre
Why Is Sex Fun? – Jared Diamond
Deep Work – Cal Newport
When to Rob a Bank – Steven D. Levitt
Curious – Ian Leslie
The Making of the Fittest – Sean B. Carroll
Eating Animals – Jonathan Safran Foer
The Innovator’s Dilemma – Clayton M. Christensen
Value Investing: A Value Investor’s Journey Through the Unknown – J. Lukas Neely
The Tell-Tale Brain – V. S. Ramachandran
Black Box Thinking – Matthew Syed
Where Good Ideas Come From – Steven Johnson
The Idea Factory: Bell Labs and the Great Age of American Innovation – Jon Gertner
The Wisest One in the Room – Thomas Gilovich and Lee Ross
Smarter Faster Better – Charles Duhigg
I Invented the Modern Age: The Rise of Henry Ford – Richard Snow
Pebbles of Perception – Laurence Endersen
The Black Swan – Nassim Nicholas Taleb
The Human Advantage – Suzana Herculano-Houzel
Concorde – Jonathan Glancey
Food Rules – Michael Pollan
Made to Stick – Chip Heath
A Survival Guide to the Misinformation Age: Scientific Habits of Mind – David J. Helfand
Peak: Secrets from the New Science of Expertise – Anders Ericsson
The Everything Store: Jeff Bezos and the Age of Amazon – Brad Stone
Brain Bugs – Dean Buonomano
The 5 Mistakes Every Investor Makes and How to Avoid Them – Peter Mallouk
Fooled by Randomness – Nassim Nicholas Taleb
A Little History of Science – William Bynum
Traffic: Why We Drive the Way We Do (and What It Says About Us) – Tom Vanderbilt
Bounce – Matthew Syed
The Halo Effect – Phil Rosenzweig
Methods of Persuasion – Nick Kolenda
Warren Buffett’s Ground Rules – Jeremy C. Miller
I Don’t – Susan Squire
Ego Is the Enemy – Ryan Holiday
How to Teach Quantum Mechanics to Your Dog – Chad Orzel
The Rise and Fall of the Third Chimpanzee: Evolution and Human Life – Jared Diamond
The Making of the Atomic Bomb – Richard Rhodes
Nudge: Improving Decisions About Health, Wealth, and Happiness – Richard H. Thaler
The Art of Doing Twice the Work in Half the Time – Jeff Sutherland
The Ascent of Money – Niall Ferguson
Decisive – Chip and Dan Heath
The Red Queen – Matt Ridley
Creativity, Inc. – Ed Catmull
Move Your Bus – Ron Clark
Concentrated Investing – Allen C. Benello
Originals – Adam Grant
Sapiens – Yuval Noah Harari
Do Gentlemen Really Prefer Blondes? – Jena Pincott
The Memory Code – Lynne Kelly
The Evolution of Everything – Matt Ridley
How Not to Be Wrong: The Power of Mathematical Thinking – Jordan Ellenberg
Bad Science – Linda Zimmermann
Eureka: How Invention Happens – Gavin Weightman
How Memory Works – Robert Madigan
From the Big Bang to Your Cells: The Remarkable Story of Minerals – Raye Kane
Naked Money – Charles Wheelan
On Intelligence – Jeff Hawkins
One to Nine – Andrew Hodges
Influence – Robert Cialdini
Inferno – Dan Brown
Stargazers – Allan Chapman
100 Baggers – Christopher Mayer
What Technology Wants – Kevin Kelly
Homo Deus – Yuval Noah Harari

Updated Binary STL File Reader

[Image: rendering of the loaded bent_plate.stl file]

A recent comment on the post titled “Binary STL file reader in numpy” prompted me to revisit the code, and while I was looking at it, I updated it to add code for viewing the loaded STL file.

The above picture shows the result.

The code is, as usual, available on GitHub here.

from binarySTLreader import BinarySTL, ShowSTLFile

# Load the binary STL file: header, triangle data, facet normals and the
# three vertex arrays (v1, v2, v3) of each triangle.
h, p, n, v1, v2, v3 = BinarySTL('bent_plate.stl')
ShowSTLFile(v1, v2, v3)  # render the loaded mesh
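
If you are curious what the viewing step involves, here is a minimal sketch of how such a viewer could be written with matplotlib’s 3D toolkit. This is a hypothetical stand-in (show_stl) for ShowSTLFile, not the actual implementation on GitHub, and it assumes v1, v2 and v3 are (N, 3) NumPy arrays of triangle vertices.

import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401  (registers the 3d projection on older matplotlib)
from mpl_toolkits.mplot3d.art3d import Poly3DCollection

def show_stl(v1, v2, v3):
    """Render a triangle mesh given three (N, 3) arrays of vertex coordinates."""
    # Stack the per-triangle vertices into an (N, 3, 3) array of polygons.
    triangles = np.stack([v1, v2, v3], axis=1)

    fig = plt.figure()
    ax = fig.add_subplot(111, projection='3d')
    ax.add_collection3d(Poly3DCollection(triangles, facecolor='0.8', edgecolor='k', linewidth=0.2))

    # Scale the axes to the bounding box of the mesh.
    pts = triangles.reshape(-1, 3)
    ax.auto_scale_xyz(pts[:, 0], pts[:, 1], pts[:, 2])
    plt.show()

Calling show_stl(v1, v2, v3) on the arrays loaded above would produce a plot along the lines of the picture shown.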

What are your favourite data science books?

[Image: Think Bayes book cover]

Recently someone on the blog asked me for some book ideas on data science, and today I saw this post on freely available data science ebooks, so here you go.

Please visit the link for the freely available ebooks. http://blog.paralleldots.com/data-scientist/list-must-read-books-data-science/

A good shout-out to the folks at ParallelDots. (I like the name!)

My picks: definitely Think Bayes and Think Stats. What are your favourite data science books?


The Need for Numerical Computation

I have been doing numerical computation ever since I entered the industry, but somehow never got around to reading the following essay. I found it by chance on Google.

A must-read for anyone interested in numerical computation.

The Need for Numerical Computation

Everyone knows that when scientists and engineers need numerical answers to mathematical problems, they turn to computers. Nevertheless, there is a widespread misconception about this process. The power of numbers has been extraordinary. It is often noted that the scientific revolution was set in motion when Galileo and others made it a principle that everything must be measured. Numerical measurements led to physical laws expressed mathematically, and, in the remarkable cycle whose fruits are all around us, finer measurements led to refined laws, which in turn led to better technology and still finer measurements.

The day has long since passed when an advance in the physical sciences could be achieved, or a significant engineering product developed, without numerical mathematics.

Computers certainly play a part in this story, yet there is a misunderstanding about what their role is. Many people imagine that scientists and mathematicians generate formulas, and then, by inserting numbers into these formulas, computers grind out the necessary results. The reality is nothing like this.


Read more by following this link: http://people.math.umass.edu/~johnston/M552S16/NAessay.pdf
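
As a small illustration of that point (my own example, not from the essay): even a square root is not obtained by plugging numbers into a closed-form formula; it is computed by an iterative algorithm such as Newton’s method, which refines a guess until it is close enough.

def newton_sqrt(a, tol=1e-12):
    """Approximate sqrt(a) with Newton's method: x_{k+1} = (x_k + a/x_k) / 2."""
    if a < 0:
        raise ValueError("a must be non-negative")
    if a == 0:
        return 0.0
    x = a if a >= 1 else 1.0          # crude initial guess
    while abs(x * x - a) > tol * a:   # iterate until the relative residual is tiny
        x = 0.5 * (x + a / x)
    return x

print(newton_sqrt(2.0))  # about 1.4142135623730951, essentially math.sqrt(2)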


Machine learning: Thou aimest high.

I was reading the book Why Nations Fail by Daron Acemoglu and James A. Robinson and found this anecdote. It reminded me of similar thoughts that many have voiced about artificial intelligence and machine learning.

In 1583 William Lee returned from his studies at the University of Cambridge to become the local priest in Calverton, England. Elizabeth I (1558–1603) had recently issued a ruling that her people should always wear a knitted cap. Lee recorded that “knitters were the only means of producing such garments but it took so long to finish the article. I began to think. I watched my mother and my sisters sitting in the evening twilight plying their needles. If garments were made by two needles and one line of thread, why not several needles to take up the thread.”

This momentous thought was the beginning of the mechanization of textile production. Lee became obsessed with making a machine that would free people from endless hand-knitting. He recalled, “My duties to Church and family I began to neglect. The idea of my machine and the creating of it ate into my heart and brain.” Finally, in 1589, his “stocking frame” knitting machine was ready. He travelled to London with excitement to seek an interview with Elizabeth I to show her how useful the machine would be and to ask her for a patent that would stop other people from copying the design.

He rented a building to set the machine up and, with the help of his local member of Parliament Richard Parkyns, met Henry Carey, Lord Hunsdon, a member of the Queen’s Privy Council. Carey arranged for Queen Elizabeth to come see the machine, but her reaction was devastating. She refused to grant Lee a patent, instead observing, “Thou aimest high, Master Lee. Consider thou what the invention could do to my poor subjects. It would assuredly bring to them ruin by depriving them of employment, thus making them beggars.”

Crushed, Lee moved to France to try his luck there; when he failed there, too, he returned to England, where he asked James I (1603–1625), Elizabeth’s successor, for a patent. James I also refused, on the same grounds as Elizabeth. Both feared that the mechanization of stocking production would be politically destabilizing. It would throw people out of work, create unemployment and political instability, and threaten royal power. The stocking frame was an innovation that promised huge productivity increases, but it also promised creative destruction.

I am not smart enough to know whether those apprehensions are right or wrong, but as a machine learning enthusiast, I am fascinated by the field.

Deep learning, AI and ML are tools, like the knife and the hammer, that we are now beginning to understand better and put to practical use.
Exciting times.