Algorithm

For many people, the word “algorithm” evokes the arcane and inscrutable machinations of big data, big government, and big business: increasingly part of the infrastructure of the modern world, but hardly a source of practical wisdom or guidance for human affairs. But an algorithm is just a finite sequence of steps used to solve a problem, and algorithms are much broader—and older by far—than the computer. Long before algorithms were ever used by machines, they were used by people.

The word “algorithm” comes from the name of Persian mathematician al-Khwārizmī, author of a ninth-century book of techniques for doing mathematics by hand. (His book was called al-Jabr wa’l-Muqābala—and the “al-jabr” of the title in turn provides the source of our word “algebra.”) The earliest known mathematical algorithms, however, predate even al-Khwārizmī’s work: a four-thousand-year-old Sumerian clay tablet found near Baghdad describes a scheme for long division.

But algorithms are not confined to mathematics alone. When you cook bread from a recipe, you’re following an algorithm. When you knit a sweater from a pattern, you’re following an algorithm. When you put a sharp edge on a piece of flint by executing a precise sequence of strikes with the end of an antler—a key step in making fine stone tools—you’re following an algorithm. Algorithms have been a part of human technology ever since the Stone Age.

From the book Algorithms to Live By, by Brian Christian and Tom Griffiths.

The Best Debugging Tool of All

Rob Pike (one of the authors of the Go language) has this to say about debugging:

A year or two after I’d joined the Labs, I was pair programming with Ken Thompson on an on-the-fly compiler for a little interactive graphics language designed by Gerard Holzmann. I was the faster typist, so I was at the keyboard and Ken was standing behind me as we programmed. We were working fast, and things broke, often visibly—it was a graphics language, after all. When something went wrong, I’d reflexively start to dig in to the problem, examining stack traces, sticking in print statements, invoking a debugger, and so on. But Ken would just stand and think, ignoring me and the code we’d just written. After a while I noticed a pattern: Ken would often understand the problem before I would, and would suddenly announce, “I know what’s wrong.” He was usually correct. I realized that Ken was building a mental model of the code and when something broke it was an error in the model. By thinking about *how* that problem could happen, he’d intuit where the model was wrong or where our code must not be satisfying the model.

Ken taught me that thinking before debugging is extremely important. If you dive into the bug, you tend to fix the local issue in the code, but if you think about the bug first, how the bug came to be, you often find and correct a higher-level problem in the code that will improve the design and prevent further bugs.

I recognize this is largely a matter of style. Some people insist on line-by-line tool-driven debugging for everything. But I now believe that thinking—without looking at the code—is the best debugging tool of all, because it leads to better software.

Updated Binary STL File Reader

[Figure: the bent_plate.stl model rendered by the updated reader]

A recent comment on the post titled "binary stl file reader in numpy" prompted me to revisit the code, and while I was looking at it, I also updated it to view the loaded STL file.

The above picture shows the result.

The code, as usual, is available on GitHub from here.

from binarySTLreader import BinarySTL, ShowSTLFile

# Read the binary STL: header, points, normals and the three vertex arrays of each facet
h, p, n, v1, v2, v3 = BinarySTL('bent_plate.stl')

# Render the loaded triangles
ShowSTLFile(v1, v2, v3)
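
For context, the binary STL format itself is simple: an 80-byte header, a 4-byte unsigned integer giving the facet count, and then one fixed 50-byte record per facet (a normal, three vertices and a 2-byte attribute), which is why NumPy can read it almost directly with a structured dtype. Here is a minimal sketch of that idea; read_binary_stl is a hypothetical name used for illustration, not the function in the repository:

import numpy as np

def read_binary_stl(path):
    # One 50-byte record per facet: normal, three vertices, 2-byte attribute
    record = np.dtype([('normal', '<f4', (3,)),
                       ('v1', '<f4', (3,)),
                       ('v2', '<f4', (3,)),
                       ('v3', '<f4', (3,)),
                       ('attr', '<u2')])
    with open(path, 'rb') as fp:
        header = fp.read(80)                               # 80-byte ASCII header
        ntri = np.frombuffer(fp.read(4), dtype='<u4')[0]   # number of facets
        data = np.frombuffer(fp.read(), dtype=record, count=ntri)
    return header, ntri, data['normal'], data['v1'], data['v2'], data['v3']

With the vertex arrays in hand, viewing is just a matter of handing the triangles to something like mpl_toolkits.mplot3d's Poly3DCollection, which is presumably what ShowSTLFile does internally.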

What are your favourite data science books?


Recently someone on the blog asked me for some book ideas on data science, and today I saw this post on freely available data science ebooks, so here you go.

Please visit the link for the freely available ebooks. http://blog.paralleldots.com/data-scientist/list-must-read-books-data-science/

A good shout-out to the good folks at ParallelDots. (I like the name!)

My picks: definitely Think Bayes and Think Stats. What are your favourite data science books?

 

The Need for Numerical Computation

I have been doing numerical computation ever since I entered the industry, but somehow I never got around to reading the following essay. I found it by chance on Google.

A must-read for anyone interested in numerical computation.

The Need for Numerical Computation

Everyone knows that when scientists and engineers need numerical answers to mathematical problems, they turn to computers. Nevertheless, there is a widespread misconception about this process.

The power of numbers has been extraordinary. It is often noted that the scientific revolution was set in motion when Galileo and others made it a principle that everything must be measured. Numerical measurements led to physical laws expressed mathematically, and, in the remarkable cycle whose fruits are all around us, finer measurements led to refined laws, which in turn led to better technology and still finer measurements.

The day has long since passed when an advance in the physical sciences could be achieved, or a significant engineering product developed, without numerical mathematics.

Computers certainly play a part in this story, yet there is a misunderstanding about what their role is. Many people imagine that scientists and mathematicians generate formulas, and then, by inserting numbers into these formulas, computers grind out the necessary results. The reality is nothing like this.

 

Read more by following this link: http://people.math.umass.edu/~johnston/M552S16/NAessay.pdf

The origins of the Internet of Things

Anywhere you look in the engineering domain, people are talking about the IoT (Internet of Things). All OEMs, their service providers, and many start-ups are looking at this new thing.

But is it really new?

It turns out it isn't: its origin goes back to 1999. Today's focus is new, mostly because of the possibility of applying advances in machine learning to the data generated by the sensors.

So if you are interested in IoT, the following short article by Professor Duncan McFarlane should be an interesting read.

Most of you have probably heard the Internet of Things, or the IoT, mentioned, but have you ever wondered what it means and where it all began?

Well here’s my version of it:

In 1999, the Auto-ID Centre was founded, which subsequently formed a unique partnership between around 100 international companies and 7 of the world’s leading research universities, including MIT and the University of Cambridge. Kevin Ashton, Professor Sanjay Sarma and David Brock were the early co-founders, and I became involved as European Research Director a year later, setting up the European side of things and pushing the industrial research.

The Auto-ID Centre’s aim was to investigate and understand what came next after the barcode – and particularly what an electronic barcode would look like. Sanjay came to see me in Cambridge in March, 2000.

We discussed barcodes and RFID as an electronic replacement and I think my initial comment was that it all seemed a reasonably dull research activity! I was of course later forced to eat my words as the project expanded but also in our research we realised that RFID was actually a solution to a manufacturing control problem we had been trying to resolve – how to establish an Internet connection for parts and products while they were being made.

The focus of the Centre from the beginning was to research ways in which an electronic tag could be put on every single object in the world, allowing each to be uniquely tracked and potentially controlled – and to do so in the most cost effective way. We realised that to make RFID cheap we needed the smallest chip possible – silicon was/is expensive – and thus we needed to put all stored data in memory elsewhere. The Internet was the obvious place to start, hence the phrase “Internet of Objects” or “Internet of Things” became a clear reference point and the origin of the Internet of Things that we refer to today. I believe the term “Internet of Things” was in fact coined by Kevin Ashton in 1999 during a presentation he made at P&G.

Read the full article