At one of his Berkshire Hathaway annual meetings, Warren Buffett said:
“The most important investment you can make is in yourself. Very few people get anything like their potential horsepower translated into the actual horsepower of their output in life. Potential exceeds realization for many people… The best asset is your own self. You can become to an enormous degree the person you want to be.”
From the book Foundations of Data Science by John Hopcroft and Ravindran Kannan
Computer science as an academic discipline began in the 1960s. Emphasis was on programming languages, compilers, operating systems, and the mathematical theory that supported these areas. Courses in theoretical computer science covered finite automata, regular expressions, context-free languages, and computability.
In the 1970s, the study of algorithms was added as an important component of theory. The emphasis was on making computers useful. Today, a fundamental change is taking place, and the focus is more on applications.
There are many reasons for this change. The merging of computing and communications has played an important role. The enhanced ability to observe, collect, and store data in the natural sciences, in commerce, and in other fields calls for a change in our understanding of data and how to handle it in the modern setting. The emergence of the web and social networks, which are by far the largest such structures, presents both opportunities and challenges for theory.
Accompanying all this is a shift of emphasis from discrete mathematics toward probability and statistics.
Author David Foster Wallace was asked to give the commencement address to the 2005 graduating class of Kenyon College. This captivating short film is, without a doubt, inspiring and thought provoking.
Today I am releasing on GitHub a simple module for creating joint plots with Matplotlib. Joint plots are available in the excellent seaborn library, but unfortunately seaborn is not always installed on many systems. I recently needed this functionality, so I wrote this simple module using Matplotlib alone.
The functionality is similar to seaborn’s, though with a more limited feature set. The module has helped me in my work, and I am releasing it in the hope that others might find it useful.
Find the code at this github repository.
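For readers curious what such a module boils down to, here is a minimal sketch of a joint plot built with plain Matplotlib: a central scatter plot flanked by marginal histograms. The function name `joint_plot` and its parameters are illustrative, not the released module’s actual API.

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, safe for scripts
import matplotlib.pyplot as plt
import numpy as np

def joint_plot(x, y, bins=30):
    """Scatter plot of (x, y) with marginal histograms on the top and right."""
    fig = plt.figure(figsize=(6, 6))
    gs = fig.add_gridspec(4, 4, hspace=0.05, wspace=0.05)
    ax_main = fig.add_subplot(gs[1:, :-1])               # central scatter
    ax_top = fig.add_subplot(gs[0, :-1], sharex=ax_main)  # x marginal
    ax_right = fig.add_subplot(gs[1:, -1], sharey=ax_main)  # y marginal
    ax_main.scatter(x, y, s=10, alpha=0.5)
    ax_top.hist(x, bins=bins)
    ax_right.hist(y, bins=bins, orientation="horizontal")
    # Hide the duplicated tick labels on the marginal axes.
    ax_top.tick_params(labelbottom=False)
    ax_right.tick_params(labelleft=False)
    return fig, (ax_main, ax_top, ax_right)

# Demo on correlated Gaussian data.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = x + rng.normal(scale=0.5, size=500)
fig, axes = joint_plot(x, y)
```

The `sharex`/`sharey` links keep the marginal histograms aligned with the scatter axes as you zoom or rescale.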
Anywhere you look in the engineering domain, people are talking about the IoT (Internet of Things). OEMs, their service providers, and many start-ups are all looking at this new thing.
But is it really new?
It turns out it isn’t. Its origin goes back to 1999. What is new today is the focus, driven mostly by the possibility of applying ever-advancing machine learning to the data generated by sensors.
So if you are interested in IoT, the following short article by Professor Duncan McFarlane should be an interesting read.
Most of you have probably heard the Internet of Things, or the IoT, mentioned, but have you ever wondered what it means and where it all began?
Well here’s my version of it:
In 1999, the Auto-ID Centre was founded; it subsequently formed a unique partnership between around 100 international companies and 7 of the world’s leading research universities, including MIT and the University of Cambridge. Kevin Ashton, Professor Sanjay Sarma and David Brock were the early co-founders, and I became involved as European Research Director a year later, setting up the European side of things and pushing the industrial research.
The Auto-ID Centre’s aim was to investigate and understand what came next after the barcode – and particularly what an electronic barcode would look like. Sanjay came to see me in Cambridge in March, 2000.
We discussed barcodes and RFID as an electronic replacement and I think my initial comment was that it all seemed a reasonably dull research activity! I was of course later forced to eat my words as the project expanded but also in our research we realised that RFID was actually a solution to a manufacturing control problem we had been trying to resolve – how to establish an Internet connection for parts and products while they were being made.
The focus of the Centre from the beginning was to research ways in which an electronic tag could be put on every single object in the world, allowing each to be uniquely tracked and potentially controlled – and to do so in the most cost-effective way. We realised that to make RFID cheap we needed the smallest chip possible – silicon was, and is, expensive – and thus we needed to put all stored data in memory elsewhere. The Internet was the obvious place to start, hence the phrase “Internet of Objects” or “Internet of Things” became a clear reference point and the origin of the Internet of Things that we refer to today. I believe the term “Internet of Things” was in fact coined by Kevin Ashton in 1999 during a presentation he made at P&G.
Read the full article
Which language should I learn for computational science? Via this:
- Start off directly using the numpy library, small scripts, and the ipython interactive shell.
- Get more advanced with the help of numerous free books and tutorials.
- Get more productive using scipy as a frontend to high-performance numerical routines and matplotlib for visualization.
- Take advantage of well-developed and powerful modules for scientific computing, such as KryPy, FEniCS, and many others.
- Notice that the smooth transition between flat and object-oriented programming, and the inherent modularity of Python, make larger projects easy to handle.
- Make your code as fast as C or Fortran by simply rewriting critical parts in cython. You can also easily include routines written in Fortran or C.
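The first three steps above fit in a few lines. The sketch below uses NumPy for arrays, SciPy as a frontend to compiled numerical routines (`scipy.integrate.quad` wraps the Fortran library QUADPACK), and Matplotlib for a quick plot; the choice of integrand is just an example.

```python
import numpy as np
from scipy import integrate
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, safe for scripts
import matplotlib.pyplot as plt

# Integrate f(x) = exp(-x^2) over [0, inf). quad hands the work to
# compiled QUADPACK routines and returns (value, estimated error).
value, abs_err = integrate.quad(lambda x: np.exp(-x**2), 0, np.inf)

# The exact answer is sqrt(pi)/2, so we can sanity-check the result.
assert abs(value - np.sqrt(np.pi) / 2) < 1e-10

# Visualise the integrand with matplotlib.
x = np.linspace(0, 3, 200)
fig, ax = plt.subplots()
ax.plot(x, np.exp(-x**2))
ax.set_xlabel("x")
ax.set_ylabel("exp(-x^2)")
```

Nothing here required writing a numerical routine by hand: the script stays small and interactive-friendly, while the heavy lifting runs in compiled code.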
This paraphrases what I think is the best way to approach a problem in scientific computing. Start by getting a handle on the problem, playing around with toy examples in small scripts. Become more systematic and set up a proper suite of code. Then make your code work! Finally, if necessary, optimize. Don’t reinvent the wheel and don’t do premature optimization.
This is the exact way I have worked in the past and this is what I recommend to anyone who asks.